Nov 25 09:04:26 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 25 09:04:26 crc restorecon[4527]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 25 09:04:26 crc restorecon[4527]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc 
restorecon[4527]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 09:04:26 crc 
restorecon[4527]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 
09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 09:04:26 crc 
restorecon[4527]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 09:04:26 crc 
restorecon[4527]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 09:04:26 
crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 
25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 
crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc 
restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc 
restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc 
restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc 
restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 09:04:26 crc 
restorecon[4527]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 09:04:26 crc restorecon[4527]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 09:04:26 crc restorecon[4527]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 09:04:26 crc restorecon[4527]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Nov 25 09:04:26 crc kubenswrapper[4565]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 25 09:04:26 crc kubenswrapper[4565]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Nov 25 09:04:26 crc kubenswrapper[4565]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 25 09:04:26 crc kubenswrapper[4565]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Nov 25 09:04:26 crc kubenswrapper[4565]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Nov 25 09:04:26 crc kubenswrapper[4565]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.966672 4565 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969053 4565 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969071 4565 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969076 4565 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969081 4565 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969085 4565 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969090 4565 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969100 4565 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969104 4565 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969108 4565 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969112 4565 
feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969116 4565 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969120 4565 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969124 4565 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969128 4565 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969133 4565 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969138 4565 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969142 4565 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969146 4565 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969150 4565 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969157 4565 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969161 4565 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969164 4565 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969167 4565 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969171 4565 
feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969174 4565 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969178 4565 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969183 4565 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969187 4565 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969190 4565 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969193 4565 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969197 4565 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969200 4565 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969207 4565 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969211 4565 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969215 4565 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969218 4565 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969222 4565 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969227 4565 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy 
Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969230 4565 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969233 4565 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969237 4565 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969240 4565 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969244 4565 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969247 4565 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969253 4565 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969257 4565 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969260 4565 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969264 4565 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969268 4565 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969271 4565 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969275 4565 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969279 4565 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969282 4565 feature_gate.go:330] 
unrecognized feature gate: PersistentIPsForVirtualization Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969285 4565 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969290 4565 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969295 4565 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969302 4565 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969306 4565 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969310 4565 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969314 4565 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969317 4565 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969321 4565 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969325 4565 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969329 4565 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969333 4565 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969337 4565 feature_gate.go:330] unrecognized feature gate: Example Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969340 4565 
feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969346 4565 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969351 4565 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969355 4565 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.969360 4565 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970244 4565 flags.go:64] FLAG: --address="0.0.0.0" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970259 4565 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970284 4565 flags.go:64] FLAG: --anonymous-auth="true" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970289 4565 flags.go:64] FLAG: --application-metrics-count-limit="100" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970294 4565 flags.go:64] FLAG: --authentication-token-webhook="false" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970298 4565 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970303 4565 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970311 4565 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970315 4565 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970320 4565 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970324 4565 
flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970329 4565 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970333 4565 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970337 4565 flags.go:64] FLAG: --cgroup-root="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970341 4565 flags.go:64] FLAG: --cgroups-per-qos="true" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970345 4565 flags.go:64] FLAG: --client-ca-file="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970350 4565 flags.go:64] FLAG: --cloud-config="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970354 4565 flags.go:64] FLAG: --cloud-provider="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970358 4565 flags.go:64] FLAG: --cluster-dns="[]" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970363 4565 flags.go:64] FLAG: --cluster-domain="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970367 4565 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970371 4565 flags.go:64] FLAG: --config-dir="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970375 4565 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970379 4565 flags.go:64] FLAG: --container-log-max-files="5" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970385 4565 flags.go:64] FLAG: --container-log-max-size="10Mi" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970389 4565 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970393 4565 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 
09:04:26.970398 4565 flags.go:64] FLAG: --containerd-namespace="k8s.io" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970402 4565 flags.go:64] FLAG: --contention-profiling="false" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970406 4565 flags.go:64] FLAG: --cpu-cfs-quota="true" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970410 4565 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970414 4565 flags.go:64] FLAG: --cpu-manager-policy="none" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970418 4565 flags.go:64] FLAG: --cpu-manager-policy-options="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970426 4565 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970430 4565 flags.go:64] FLAG: --enable-controller-attach-detach="true" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970434 4565 flags.go:64] FLAG: --enable-debugging-handlers="true" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970438 4565 flags.go:64] FLAG: --enable-load-reader="false" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970442 4565 flags.go:64] FLAG: --enable-server="true" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970446 4565 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970452 4565 flags.go:64] FLAG: --event-burst="100" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970455 4565 flags.go:64] FLAG: --event-qps="50" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970460 4565 flags.go:64] FLAG: --event-storage-age-limit="default=0" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970466 4565 flags.go:64] FLAG: --event-storage-event-limit="default=0" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970470 4565 flags.go:64] FLAG: --eviction-hard="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 
09:04:26.970475 4565 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970479 4565 flags.go:64] FLAG: --eviction-minimum-reclaim="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970483 4565 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970487 4565 flags.go:64] FLAG: --eviction-soft="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970491 4565 flags.go:64] FLAG: --eviction-soft-grace-period="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970495 4565 flags.go:64] FLAG: --exit-on-lock-contention="false" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970501 4565 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970505 4565 flags.go:64] FLAG: --experimental-mounter-path="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970509 4565 flags.go:64] FLAG: --fail-cgroupv1="false" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970524 4565 flags.go:64] FLAG: --fail-swap-on="true" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970528 4565 flags.go:64] FLAG: --feature-gates="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970533 4565 flags.go:64] FLAG: --file-check-frequency="20s" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970537 4565 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970541 4565 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970547 4565 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970551 4565 flags.go:64] FLAG: --healthz-port="10248" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970555 4565 flags.go:64] FLAG: --help="false" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 
09:04:26.970560 4565 flags.go:64] FLAG: --hostname-override="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970564 4565 flags.go:64] FLAG: --housekeeping-interval="10s" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970568 4565 flags.go:64] FLAG: --http-check-frequency="20s" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970572 4565 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970576 4565 flags.go:64] FLAG: --image-credential-provider-config="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970580 4565 flags.go:64] FLAG: --image-gc-high-threshold="85" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970586 4565 flags.go:64] FLAG: --image-gc-low-threshold="80" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970590 4565 flags.go:64] FLAG: --image-service-endpoint="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970593 4565 flags.go:64] FLAG: --kernel-memcg-notification="false" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970597 4565 flags.go:64] FLAG: --kube-api-burst="100" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970601 4565 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970605 4565 flags.go:64] FLAG: --kube-api-qps="50" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970609 4565 flags.go:64] FLAG: --kube-reserved="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970613 4565 flags.go:64] FLAG: --kube-reserved-cgroup="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970617 4565 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970623 4565 flags.go:64] FLAG: --kubelet-cgroups="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970627 4565 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Nov 25 09:04:26 crc 
kubenswrapper[4565]: I1125 09:04:26.970631 4565 flags.go:64] FLAG: --lock-file="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970635 4565 flags.go:64] FLAG: --log-cadvisor-usage="false" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970639 4565 flags.go:64] FLAG: --log-flush-frequency="5s" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970643 4565 flags.go:64] FLAG: --log-json-info-buffer-size="0" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970649 4565 flags.go:64] FLAG: --log-json-split-stream="false" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970653 4565 flags.go:64] FLAG: --log-text-info-buffer-size="0" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970659 4565 flags.go:64] FLAG: --log-text-split-stream="false" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970662 4565 flags.go:64] FLAG: --logging-format="text" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970666 4565 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970671 4565 flags.go:64] FLAG: --make-iptables-util-chains="true" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970675 4565 flags.go:64] FLAG: --manifest-url="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970678 4565 flags.go:64] FLAG: --manifest-url-header="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970684 4565 flags.go:64] FLAG: --max-housekeeping-interval="15s" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970688 4565 flags.go:64] FLAG: --max-open-files="1000000" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970694 4565 flags.go:64] FLAG: --max-pods="110" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970699 4565 flags.go:64] FLAG: --maximum-dead-containers="-1" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970703 4565 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Nov 25 09:04:26 crc 
kubenswrapper[4565]: I1125 09:04:26.970706 4565 flags.go:64] FLAG: --memory-manager-policy="None" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970710 4565 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970715 4565 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970720 4565 flags.go:64] FLAG: --node-ip="192.168.126.11" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970724 4565 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970737 4565 flags.go:64] FLAG: --node-status-max-images="50" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970741 4565 flags.go:64] FLAG: --node-status-update-frequency="10s" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970745 4565 flags.go:64] FLAG: --oom-score-adj="-999" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970749 4565 flags.go:64] FLAG: --pod-cidr="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970753 4565 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970759 4565 flags.go:64] FLAG: --pod-manifest-path="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970763 4565 flags.go:64] FLAG: --pod-max-pids="-1" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970769 4565 flags.go:64] FLAG: --pods-per-core="0" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970773 4565 flags.go:64] FLAG: --port="10250" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970777 4565 flags.go:64] FLAG: --protect-kernel-defaults="false" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970781 4565 flags.go:64] FLAG: 
--provider-id="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970785 4565 flags.go:64] FLAG: --qos-reserved="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970789 4565 flags.go:64] FLAG: --read-only-port="10255" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970793 4565 flags.go:64] FLAG: --register-node="true" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970797 4565 flags.go:64] FLAG: --register-schedulable="true" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970801 4565 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970809 4565 flags.go:64] FLAG: --registry-burst="10" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970813 4565 flags.go:64] FLAG: --registry-qps="5" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970816 4565 flags.go:64] FLAG: --reserved-cpus="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970821 4565 flags.go:64] FLAG: --reserved-memory="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970825 4565 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970830 4565 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970834 4565 flags.go:64] FLAG: --rotate-certificates="false" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970841 4565 flags.go:64] FLAG: --rotate-server-certificates="false" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970845 4565 flags.go:64] FLAG: --runonce="false" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970848 4565 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970852 4565 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970857 4565 flags.go:64] FLAG: --seccomp-default="false" Nov 
25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970861 4565 flags.go:64] FLAG: --serialize-image-pulls="true" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970865 4565 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970869 4565 flags.go:64] FLAG: --storage-driver-db="cadvisor" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970874 4565 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970880 4565 flags.go:64] FLAG: --storage-driver-password="root" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970884 4565 flags.go:64] FLAG: --storage-driver-secure="false" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970888 4565 flags.go:64] FLAG: --storage-driver-table="stats" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970893 4565 flags.go:64] FLAG: --storage-driver-user="root" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970897 4565 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970901 4565 flags.go:64] FLAG: --sync-frequency="1m0s" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970905 4565 flags.go:64] FLAG: --system-cgroups="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970909 4565 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970918 4565 flags.go:64] FLAG: --system-reserved-cgroup="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970922 4565 flags.go:64] FLAG: --tls-cert-file="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970941 4565 flags.go:64] FLAG: --tls-cipher-suites="[]" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970946 4565 flags.go:64] FLAG: --tls-min-version="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970950 4565 flags.go:64] FLAG: 
--tls-private-key-file="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970954 4565 flags.go:64] FLAG: --topology-manager-policy="none" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970961 4565 flags.go:64] FLAG: --topology-manager-policy-options="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970967 4565 flags.go:64] FLAG: --topology-manager-scope="container" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970971 4565 flags.go:64] FLAG: --v="2" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.970980 4565 flags.go:64] FLAG: --version="false" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.971016 4565 flags.go:64] FLAG: --vmodule="" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.971021 4565 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.971026 4565 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971601 4565 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971610 4565 feature_gate.go:330] unrecognized feature gate: Example Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971614 4565 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971618 4565 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971622 4565 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971626 4565 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971629 4565 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971633 4565 
feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971636 4565 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971640 4565 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971645 4565 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971649 4565 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971653 4565 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971656 4565 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971660 4565 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971663 4565 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971666 4565 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971670 4565 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971674 4565 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971677 4565 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971680 4565 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971683 4565 
feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971687 4565 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971690 4565 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971693 4565 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971697 4565 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971700 4565 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971703 4565 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971706 4565 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971710 4565 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971713 4565 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971716 4565 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971719 4565 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971722 4565 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971726 4565 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971729 4565 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 25 
09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971732 4565 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971735 4565 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971738 4565 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971741 4565 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971745 4565 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971749 4565 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971754 4565 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971758 4565 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971762 4565 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971767 4565 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971770 4565 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971773 4565 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971776 4565 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971779 4565 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971783 4565 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971786 4565 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971790 4565 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971793 4565 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971796 4565 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971800 4565 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971804 4565 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971807 4565 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971810 4565 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971814 4565 feature_gate.go:330] 
unrecognized feature gate: MachineAPIMigration Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971817 4565 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971820 4565 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971825 4565 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971828 4565 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971832 4565 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971836 4565 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971840 4565 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971843 4565 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971846 4565 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971849 4565 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.971852 4565 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.972269 4565 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false 
TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.978807 4565 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.978835 4565 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.978901 4565 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.978914 4565 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.978918 4565 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.978921 4565 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.978938 4565 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.978943 4565 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.978946 4565 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.978950 4565 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.978953 4565 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.978956 4565 feature_gate.go:330] unrecognized feature gate: Example Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.978959 4565 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 
09:04:26.978963 4565 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.978966 4565 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.978969 4565 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.978972 4565 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.978976 4565 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.978979 4565 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.978982 4565 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.978985 4565 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.978988 4565 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.978991 4565 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.978995 4565 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979000 4565 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979003 4565 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979006 4565 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979009 4565 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979012 4565 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979016 4565 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979019 4565 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979022 4565 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979027 4565 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979031 4565 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979035 4565 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979039 4565 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979044 4565 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979048 4565 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979053 4565 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979057 4565 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979061 4565 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979064 4565 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979068 4565 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979071 4565 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979074 4565 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979077 4565 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979080 4565 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979083 4565 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979087 4565 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979090 4565 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979093 4565 feature_gate.go:330] 
unrecognized feature gate: GCPClusterHostedDNS Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979096 4565 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979099 4565 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979103 4565 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979106 4565 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979109 4565 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979113 4565 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979117 4565 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979121 4565 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979125 4565 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979131 4565 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979135 4565 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979138 4565 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979142 4565 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979145 4565 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979149 4565 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979152 4565 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979156 4565 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979159 4565 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979162 4565 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979165 4565 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979168 4565 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979172 4565 
feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.979178 4565 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979292 4565 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979299 4565 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979303 4565 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979307 4565 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979310 4565 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979314 4565 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979317 4565 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979320 4565 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979323 4565 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979327 4565 feature_gate.go:330] unrecognized feature 
gate: CSIDriverSharedResource Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979330 4565 feature_gate.go:330] unrecognized feature gate: Example Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979333 4565 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979337 4565 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979342 4565 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979345 4565 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979349 4565 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979352 4565 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979355 4565 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979359 4565 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979362 4565 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979365 4565 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979369 4565 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979372 4565 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979375 4565 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 25 
09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979378 4565 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979382 4565 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979386 4565 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979390 4565 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979393 4565 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979397 4565 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979400 4565 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979404 4565 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979407 4565 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979411 4565 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979415 4565 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979418 4565 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979422 4565 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979425 4565 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979429 4565 feature_gate.go:330] 
unrecognized feature gate: ManagedBootImages Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979432 4565 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979435 4565 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979438 4565 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979441 4565 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979444 4565 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979448 4565 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979451 4565 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979455 4565 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979459 4565 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979462 4565 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979466 4565 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979469 4565 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979473 4565 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979476 4565 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979480 4565 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979483 4565 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979486 4565 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979490 4565 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979493 4565 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979496 4565 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979499 4565 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979502 4565 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979506 4565 
feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979509 4565 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979521 4565 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979524 4565 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979528 4565 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979531 4565 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979534 4565 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979537 4565 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979541 4565 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Nov 25 09:04:26 crc kubenswrapper[4565]: W1125 09:04:26.979545 4565 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.979551 4565 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.979980 4565 server.go:940] "Client rotation is on, will bootstrap in background" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.983232 4565 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.983317 4565 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.984222 4565 server.go:997] "Starting client certificate rotation" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.984247 4565 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.984805 4565 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-08 05:15:43.584825411 +0000 UTC Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.984876 4565 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1052h11m16.599951239s for next certificate rotation Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.997449 4565 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 25 09:04:26 crc kubenswrapper[4565]: I1125 09:04:26.999204 4565 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.005796 4565 log.go:25] "Validated CRI v1 runtime API" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.023534 4565 log.go:25] "Validated CRI v1 image API" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.024576 4565 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.028437 4565 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-25-09-00-15-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.028457 4565 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:49 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm:{mountpoint:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:50 fsType:tmpfs blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/94b752e0a51c0134b00ddef6dc7a933a9d7c1d9bdc88a18dae4192a0d557d623/merged major:0 minor:43 fsType:overlay blockSize:0}] Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.039814 4565 manager.go:217] Machine: {Timestamp:2025-11-25 09:04:27.038525515 +0000 UTC m=+0.241020673 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2445406 MemoryCapacity:25199476736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:717cb293-950d-4b28-956b-07370f319336 BootID:d91d380a-1f82-4c23-9139-1b88f9b7dd73 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:65536000 Type:vfs Inodes:3076108 HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:49 Capacity:2519945216 Type:vfs Inodes:615221 
HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:50 Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599738368 Type:vfs Inodes:3076108 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599738368 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:4e:7b:4f Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:enp3s0 MacAddress:fa:16:3e:4e:7b:4f Speed:-1 Mtu:1500} {Name:enp7s0 MacAddress:fa:16:3e:6d:3f:2e Speed:-1 Mtu:1440} {Name:enp7s0.20 MacAddress:52:54:00:13:0c:42 Speed:-1 Mtu:1436} {Name:enp7s0.21 MacAddress:52:54:00:a8:90:2e Speed:-1 Mtu:1436} {Name:enp7s0.22 MacAddress:52:54:00:fb:80:bd Speed:-1 Mtu:1436} {Name:enp7s0.23 MacAddress:52:54:00:3e:24:8b Speed:-1 Mtu:1436} {Name:eth10 MacAddress:3e:71:68:f5:e8:54 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:2e:70:92:15:30:e7 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199476736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:65536 Type:Data Level:1} {Id:0 Size:65536 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:65536 Type:Data Level:1} {Id:1 Size:65536 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:65536 Type:Data Level:1} {Id:2 Size:65536 Type:Instruction 
Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:65536 Type:Data Level:1} {Id:3 Size:65536 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:65536 Type:Data Level:1} {Id:4 Size:65536 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:65536 Type:Data Level:1} {Id:5 Size:65536 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:65536 Type:Data Level:1} {Id:6 Size:65536 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:65536 Type:Data Level:1} {Id:7 Size:65536 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.039965 4565 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.040039 4565 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.041076 4565 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.041221 4565 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.041243 4565 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.041388 4565 topology_manager.go:138] "Creating topology manager with none policy"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.041396 4565 container_manager_linux.go:303] "Creating device plugin manager"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.041693 4565 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.041718 4565 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.041786 4565 state_mem.go:36] "Initialized new in-memory state store"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.041847 4565 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.043865 4565 kubelet.go:418] "Attempting to sync node with API server"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.043886 4565 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.043937 4565 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.043948 4565 kubelet.go:324] "Adding apiserver pod source"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.043957 4565 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.046069 4565 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Nov 25 09:04:27 crc kubenswrapper[4565]: W1125 09:04:27.046212 4565 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.26.129:6443: connect: connection refused
Nov 25 09:04:27 crc kubenswrapper[4565]: W1125 09:04:27.046212 4565 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.26.129:6443: connect: connection refused
Nov 25 09:04:27 crc kubenswrapper[4565]: E1125 09:04:27.046316 4565 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.26.129:6443: connect: connection refused" logger="UnhandledError"
Nov 25 09:04:27 crc kubenswrapper[4565]: E1125 09:04:27.046293 4565 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.26.129:6443: connect: connection refused" logger="UnhandledError"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.046683 4565 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.048001 4565 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.048910 4565 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.049001 4565 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.049050 4565 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.049094 4565 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.049141 4565 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.049183 4565 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.049223 4565 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.049272 4565 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.049330 4565 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.049375 4565 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.049419 4565 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.049465 4565 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.049924 4565 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.050280 4565 server.go:1280] "Started kubelet"
Nov 25 09:04:27 crc systemd[1]: Started Kubernetes Kubelet.
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.051256 4565 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.051597 4565 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.052346 4565 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.26.129:6443: connect: connection refused
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.052448 4565 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.052987 4565 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.053020 4565 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.053611 4565 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 19:01:29.826265829 +0000 UTC
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.053707 4565 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 969h57m2.77256124s for next certificate rotation
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.053814 4565 volume_manager.go:287] "The desired_state_of_world populator starts"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.053831 4565 volume_manager.go:289] "Starting Kubelet Volume Manager"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.053960 4565 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.054841 4565 server.go:460] "Adding debug handlers to kubelet server"
Nov 25 09:04:27 crc kubenswrapper[4565]: W1125 09:04:27.055241 4565 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.26.129:6443: connect: connection refused
Nov 25 09:04:27 crc kubenswrapper[4565]: E1125 09:04:27.055300 4565 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.26.129:6443: connect: connection refused" logger="UnhandledError"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.055348 4565 factory.go:55] Registering systemd factory
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.055362 4565 factory.go:221] Registration of the systemd container factory successfully
Nov 25 09:04:27 crc kubenswrapper[4565]: E1125 09:04:27.055486 4565 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Nov 25 09:04:27 crc kubenswrapper[4565]: E1125 09:04:27.056061 4565 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.129:6443: connect: connection refused" interval="200ms"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.056155 4565 factory.go:153] Registering CRI-O factory
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.056166 4565 factory.go:221] Registration of the crio container factory successfully
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.056220 4565 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.056238 4565 factory.go:103] Registering Raw factory
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.056256 4565 manager.go:1196] Started watching for new ooms in manager
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.061235 4565 manager.go:319] Starting recovery of all containers
Nov 25 09:04:27 crc kubenswrapper[4565]: E1125 09:04:27.060765 4565 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 192.168.26.129:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187b348ed17c33bd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-25 09:04:27.050259389 +0000 UTC m=+0.252754527,LastTimestamp:2025-11-25 09:04:27.050259389 +0000 UTC m=+0.252754527,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.064528 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.064620 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.064680 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.064738 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.064802 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.064853 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.064908 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.064987 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.065040 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.065091 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.065149 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.065225 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.065274 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.065327 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.065374 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.065430 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.065479 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.065540 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.065590 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.065637 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.065689 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.065738 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.065790 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.065838 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.065884 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.065978 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.066047 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.066102 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.066152 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.066199 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.066248 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.066300 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.066349 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.066396 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.066444 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.066494 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.066560 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.066610 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.066658 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.066705 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.066753 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.066806 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.066860 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.066912 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.066983 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.067032 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.067079 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.067134 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.067182 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.067228 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.067275 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.067322 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.067379 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.067436 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.067495 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.067561 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.067612 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.067665 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.067714 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.067761 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.067809 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.068315 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.068370 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.068429 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.068479 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.068541 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.068596 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.068645 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.068701 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.068751 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.068797 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.068843 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.068892 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.068966 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.069018 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.069065 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.069115 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.069162 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.069208 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.069265 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.069317 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.070955 4565 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.070980 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.070994 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071003 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071011 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071018 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071026 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071033 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071041 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071049 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071056 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071065 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071073 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071080 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071087 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071095 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071105 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071114 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" 
seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071123 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071130 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071138 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071146 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071154 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071161 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071173 
4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071192 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071200 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071212 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071221 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071229 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071237 4565 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071246 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071255 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071263 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071271 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071278 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071286 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071293 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071301 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071308 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071316 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071323 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071330 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071338 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071359 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071367 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071375 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071382 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071389 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071398 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071405 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071413 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071420 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071428 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071437 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071444 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071452 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071627 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071635 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071646 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071654 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071662 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071669 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071677 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071684 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071692 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071700 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071707 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071715 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071722 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071730 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071738 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071748 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" 
seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071756 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071763 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071772 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071779 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071786 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071794 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 
09:04:27.071801 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071809 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071816 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071824 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071832 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071840 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071848 4565 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071856 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071865 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071873 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071883 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071891 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071901 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071910 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.071917 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073357 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073378 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073387 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073406 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073415 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073425 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073434 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073442 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073450 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073458 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073466 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073474 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073481 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073491 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073499 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073507 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073525 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073533 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073541 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073550 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073559 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073758 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073765 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073773 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073781 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073788 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073795 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073802 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Nov 
25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073810 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073869 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073885 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073896 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073906 4565 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073914 4565 reconstruct.go:97] "Volume reconstruction finished" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.073920 4565 reconciler.go:26] "Reconciler: start to sync state" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.076089 4565 manager.go:324] Recovery completed Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 
09:04:27.083563 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.084436 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.084463 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.084473 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.085020 4565 cpu_manager.go:225] "Starting CPU manager" policy="none" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.085082 4565 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.085134 4565 state_mem.go:36] "Initialized new in-memory state store" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.088855 4565 policy_none.go:49] "None policy: Start" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.090275 4565 memory_manager.go:170] "Starting memorymanager" policy="None" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.090301 4565 state_mem.go:35] "Initializing new in-memory state store" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.094688 4565 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.095943 4565 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.095968 4565 status_manager.go:217] "Starting to sync pod status with apiserver" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.095989 4565 kubelet.go:2335] "Starting kubelet main sync loop" Nov 25 09:04:27 crc kubenswrapper[4565]: E1125 09:04:27.096022 4565 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 25 09:04:27 crc kubenswrapper[4565]: W1125 09:04:27.097508 4565 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.26.129:6443: connect: connection refused Nov 25 09:04:27 crc kubenswrapper[4565]: E1125 09:04:27.097570 4565 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.26.129:6443: connect: connection refused" logger="UnhandledError" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.134709 4565 manager.go:334] "Starting Device Plugin manager" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.134755 4565 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.134767 4565 server.go:79] "Starting device plugin registration server" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.135067 4565 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.135085 4565 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 25 09:04:27 crc 
kubenswrapper[4565]: I1125 09:04:27.135362 4565 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.135448 4565 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.135461 4565 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 25 09:04:27 crc kubenswrapper[4565]: E1125 09:04:27.141499 4565 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.196546 4565 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.196608 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.197237 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.197265 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.197274 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.197355 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.197499 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.197556 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.197877 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.197910 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.197921 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.198073 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.198257 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.198292 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.198305 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.198316 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.198326 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.198639 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.198660 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.198670 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.198742 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.198879 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.198912 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.199362 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.199386 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.199395 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.199462 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.199475 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.199484 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.199487 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.199549 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.199583 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.199593 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:27 crc kubenswrapper[4565]: 
I1125 09:04:27.199708 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.199734 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.199999 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.200024 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.200034 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.200135 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.200158 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.200644 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.200713 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.200729 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.200857 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.200885 4565 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.200895 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.235678 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.236284 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.236307 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.236317 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.236331 4565 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 09:04:27 crc kubenswrapper[4565]: E1125 09:04:27.236703 4565 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.129:6443: connect: connection refused" node="crc" Nov 25 09:04:27 crc kubenswrapper[4565]: E1125 09:04:27.256869 4565 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.129:6443: connect: connection refused" interval="400ms" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.276208 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.276314 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.276389 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.276523 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.276594 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.276659 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.276760 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.276827 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.276892 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.276979 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.277046 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.277121 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.277182 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.277241 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.277298 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378111 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378148 4565 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378167 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378180 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378193 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378208 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378221 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378233 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378247 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378261 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378273 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378285 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378296 4565 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378309 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378321 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378577 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378622 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378647 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378675 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378686 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378721 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378724 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378748 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378773 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378774 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378791 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378803 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378820 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378828 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.378846 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.437071 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.438035 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.438134 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.438213 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.438300 4565 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 09:04:27 crc kubenswrapper[4565]: E1125 09:04:27.438666 4565 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.129:6443: connect: connection refused" node="crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.524047 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.530144 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.542945 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: W1125 09:04:27.545083 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-8c590b4df349ee9c09dcdd726faf16232ac304787b72cb529c4e84e8f32465a5 WatchSource:0}: Error finding container 8c590b4df349ee9c09dcdd726faf16232ac304787b72cb529c4e84e8f32465a5: Status 404 returned error can't find the container with id 8c590b4df349ee9c09dcdd726faf16232ac304787b72cb529c4e84e8f32465a5 Nov 25 09:04:27 crc kubenswrapper[4565]: W1125 09:04:27.548118 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-ee8d25f4a8f7329ee2b859eb322cbcf2341d7fe58ad9552f31e54edbf2741129 WatchSource:0}: Error finding container ee8d25f4a8f7329ee2b859eb322cbcf2341d7fe58ad9552f31e54edbf2741129: Status 404 returned error can't find the container with id ee8d25f4a8f7329ee2b859eb322cbcf2341d7fe58ad9552f31e54edbf2741129 Nov 25 09:04:27 crc kubenswrapper[4565]: W1125 09:04:27.555287 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-bc31655e8007965fb28341200d80419c4ab8e563d0aa9f7ecb0598145c849c07 WatchSource:0}: Error finding container bc31655e8007965fb28341200d80419c4ab8e563d0aa9f7ecb0598145c849c07: Status 404 returned error can't find the container with id bc31655e8007965fb28341200d80419c4ab8e563d0aa9f7ecb0598145c849c07 Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.562338 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.566768 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 09:04:27 crc kubenswrapper[4565]: W1125 09:04:27.570066 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-3c4645663fa7b3220917511385323b8abe3b6112af92794df9656b0565811f67 WatchSource:0}: Error finding container 3c4645663fa7b3220917511385323b8abe3b6112af92794df9656b0565811f67: Status 404 returned error can't find the container with id 3c4645663fa7b3220917511385323b8abe3b6112af92794df9656b0565811f67 Nov 25 09:04:27 crc kubenswrapper[4565]: W1125 09:04:27.574260 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-f508f099af64214f180eab37c022b90c06b79c474fcb03dc731666f85d8c2c45 WatchSource:0}: Error finding container f508f099af64214f180eab37c022b90c06b79c474fcb03dc731666f85d8c2c45: Status 404 returned error can't find the container with id f508f099af64214f180eab37c022b90c06b79c474fcb03dc731666f85d8c2c45 Nov 25 09:04:27 crc kubenswrapper[4565]: E1125 09:04:27.658123 4565 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.129:6443: connect: connection refused" interval="800ms" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.839197 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.839882 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.839906 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.839916 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:27 crc kubenswrapper[4565]: I1125 09:04:27.840014 4565 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 09:04:27 crc kubenswrapper[4565]: E1125 09:04:27.840270 4565 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.129:6443: connect: connection refused" node="crc" Nov 25 09:04:27 crc kubenswrapper[4565]: W1125 09:04:27.894692 4565 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.26.129:6443: connect: connection refused Nov 25 09:04:27 crc kubenswrapper[4565]: E1125 09:04:27.894902 4565 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.26.129:6443: connect: connection refused" logger="UnhandledError" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.053732 4565 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.26.129:6443: connect: connection refused Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.102234 4565 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5" exitCode=0 Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.102315 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5"} Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.102409 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8c590b4df349ee9c09dcdd726faf16232ac304787b72cb529c4e84e8f32465a5"} Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.102498 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.105564 4565 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="aba0a33b25941dc36369215b2f038d707c968205e863d2c57e5560b8ff5e29aa" exitCode=0 Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.105648 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"aba0a33b25941dc36369215b2f038d707c968205e863d2c57e5560b8ff5e29aa"} Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.105680 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ee8d25f4a8f7329ee2b859eb322cbcf2341d7fe58ad9552f31e54edbf2741129"} Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.105809 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.106124 4565 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.106255 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.106269 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.106942 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.106967 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.106977 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.108013 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb"} Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.108043 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f508f099af64214f180eab37c022b90c06b79c474fcb03dc731666f85d8c2c45"} Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.109609 4565 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="f13c927d9561a2a8f119338d4774db6f33bdd828cafcd0778c7a569da5526f56" exitCode=0 Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.109643 4565 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"f13c927d9561a2a8f119338d4774db6f33bdd828cafcd0778c7a569da5526f56"} Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.109658 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3c4645663fa7b3220917511385323b8abe3b6112af92794df9656b0565811f67"} Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.109703 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.110186 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.110206 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.110215 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.111251 4565 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc" exitCode=0 Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.111284 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc"} Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.111308 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bc31655e8007965fb28341200d80419c4ab8e563d0aa9f7ecb0598145c849c07"} Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.111516 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.112316 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.112340 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.112368 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.115346 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.115891 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.115919 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.115994 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:28 crc kubenswrapper[4565]: W1125 09:04:28.299972 4565 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.26.129:6443: connect: connection refused Nov 25 09:04:28 crc kubenswrapper[4565]: E1125 09:04:28.300028 4565 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.26.129:6443: connect: connection refused" logger="UnhandledError" Nov 25 09:04:28 crc kubenswrapper[4565]: W1125 09:04:28.336426 4565 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.26.129:6443: connect: connection refused Nov 25 09:04:28 crc kubenswrapper[4565]: E1125 09:04:28.336485 4565 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.26.129:6443: connect: connection refused" logger="UnhandledError" Nov 25 09:04:28 crc kubenswrapper[4565]: E1125 09:04:28.458910 4565 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.129:6443: connect: connection refused" interval="1.6s" Nov 25 09:04:28 crc kubenswrapper[4565]: W1125 09:04:28.510136 4565 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.26.129:6443: connect: connection refused Nov 25 09:04:28 crc kubenswrapper[4565]: E1125 09:04:28.510207 4565 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.26.129:6443: 
connect: connection refused" logger="UnhandledError" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.640489 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.641661 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.641698 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.641708 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:28 crc kubenswrapper[4565]: I1125 09:04:28.641727 4565 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 09:04:28 crc kubenswrapper[4565]: E1125 09:04:28.642080 4565 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.129:6443: connect: connection refused" node="crc" Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.115157 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7"} Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.115187 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419"} Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.115197 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e"} Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.115207 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f"} Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.115215 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c"} Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.115289 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.115898 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.115956 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.115965 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.117072 4565 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5" exitCode=0 Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.117114 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5"} Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.117177 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.117809 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.117831 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.117838 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.118948 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ae40797698f4d237f5d8638e761b678cd5ae6be5716320228ac72ca65e2ae577"} Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.119004 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.119768 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.119781 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.119788 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.120783 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a"} Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.120805 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd"} Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.120814 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4"} Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.120860 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.121366 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.121397 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.121405 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.123611 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3129aed1548617b0b63fe023e6112d5be72db903bcc44a95a376cd3f42be0d9f"} Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.123631 4565 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8e48dee57ce020adcee27aca2d8950c9cebead8c430113c4c151b08babac9299"} Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.123641 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1b5495757ee21b1874ef175f98308016e06007f6a55bc7b55376b82cf291878a"} Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.123686 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.124263 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.124293 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.124302 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:29 crc kubenswrapper[4565]: I1125 09:04:29.248424 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:04:30 crc kubenswrapper[4565]: I1125 09:04:30.128529 4565 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1" exitCode=0 Nov 25 09:04:30 crc kubenswrapper[4565]: I1125 09:04:30.128598 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1"} Nov 25 09:04:30 crc kubenswrapper[4565]: I1125 
09:04:30.128626 4565 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 09:04:30 crc kubenswrapper[4565]: I1125 09:04:30.128656 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:30 crc kubenswrapper[4565]: I1125 09:04:30.128692 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:30 crc kubenswrapper[4565]: I1125 09:04:30.128695 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:30 crc kubenswrapper[4565]: I1125 09:04:30.129638 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:30 crc kubenswrapper[4565]: I1125 09:04:30.129668 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:30 crc kubenswrapper[4565]: I1125 09:04:30.129671 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:30 crc kubenswrapper[4565]: I1125 09:04:30.129678 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:30 crc kubenswrapper[4565]: I1125 09:04:30.129689 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:30 crc kubenswrapper[4565]: I1125 09:04:30.129700 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:30 crc kubenswrapper[4565]: I1125 09:04:30.129640 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:30 crc kubenswrapper[4565]: I1125 09:04:30.129734 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:30 crc 
kubenswrapper[4565]: I1125 09:04:30.129741 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:30 crc kubenswrapper[4565]: I1125 09:04:30.242300 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:30 crc kubenswrapper[4565]: I1125 09:04:30.243446 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:30 crc kubenswrapper[4565]: I1125 09:04:30.243480 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:30 crc kubenswrapper[4565]: I1125 09:04:30.243490 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:30 crc kubenswrapper[4565]: I1125 09:04:30.243513 4565 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 09:04:30 crc kubenswrapper[4565]: I1125 09:04:30.377433 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 09:04:30 crc kubenswrapper[4565]: I1125 09:04:30.377624 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:30 crc kubenswrapper[4565]: I1125 09:04:30.378328 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:30 crc kubenswrapper[4565]: I1125 09:04:30.378349 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:30 crc kubenswrapper[4565]: I1125 09:04:30.378357 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:31 crc kubenswrapper[4565]: I1125 09:04:31.135906 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38"} Nov 25 09:04:31 crc kubenswrapper[4565]: I1125 09:04:31.136003 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f"} Nov 25 09:04:31 crc kubenswrapper[4565]: I1125 09:04:31.136017 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b"} Nov 25 09:04:31 crc kubenswrapper[4565]: I1125 09:04:31.136029 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7"} Nov 25 09:04:31 crc kubenswrapper[4565]: I1125 09:04:31.136038 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0"} Nov 25 09:04:31 crc kubenswrapper[4565]: I1125 09:04:31.136216 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:31 crc kubenswrapper[4565]: I1125 09:04:31.137478 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:31 crc kubenswrapper[4565]: I1125 09:04:31.137521 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:31 crc kubenswrapper[4565]: I1125 09:04:31.137531 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 25 09:04:31 crc kubenswrapper[4565]: I1125 09:04:31.537676 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:04:31 crc kubenswrapper[4565]: I1125 09:04:31.537820 4565 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 09:04:31 crc kubenswrapper[4565]: I1125 09:04:31.537866 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:31 crc kubenswrapper[4565]: I1125 09:04:31.538963 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:31 crc kubenswrapper[4565]: I1125 09:04:31.538993 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:31 crc kubenswrapper[4565]: I1125 09:04:31.539002 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:32 crc kubenswrapper[4565]: I1125 09:04:32.992659 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Nov 25 09:04:32 crc kubenswrapper[4565]: I1125 09:04:32.992806 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:32 crc kubenswrapper[4565]: I1125 09:04:32.993669 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:32 crc kubenswrapper[4565]: I1125 09:04:32.993697 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:32 crc kubenswrapper[4565]: I1125 09:04:32.993705 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:33 crc kubenswrapper[4565]: I1125 09:04:33.391452 4565 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 09:04:33 crc kubenswrapper[4565]: I1125 09:04:33.391614 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:33 crc kubenswrapper[4565]: I1125 09:04:33.392631 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:33 crc kubenswrapper[4565]: I1125 09:04:33.392664 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:33 crc kubenswrapper[4565]: I1125 09:04:33.392673 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:35 crc kubenswrapper[4565]: I1125 09:04:35.405908 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:04:35 crc kubenswrapper[4565]: I1125 09:04:35.406102 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:35 crc kubenswrapper[4565]: I1125 09:04:35.407075 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:35 crc kubenswrapper[4565]: I1125 09:04:35.407118 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:35 crc kubenswrapper[4565]: I1125 09:04:35.407128 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:36 crc kubenswrapper[4565]: I1125 09:04:36.099217 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 09:04:36 crc kubenswrapper[4565]: I1125 09:04:36.099402 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Nov 25 09:04:36 crc kubenswrapper[4565]: I1125 09:04:36.100300 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:36 crc kubenswrapper[4565]: I1125 09:04:36.100332 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:36 crc kubenswrapper[4565]: I1125 09:04:36.100341 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:36 crc kubenswrapper[4565]: I1125 09:04:36.392337 4565 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 25 09:04:36 crc kubenswrapper[4565]: I1125 09:04:36.392416 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 09:04:37 crc kubenswrapper[4565]: E1125 09:04:37.142562 4565 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 25 09:04:37 crc kubenswrapper[4565]: I1125 09:04:37.597467 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 09:04:37 crc kubenswrapper[4565]: I1125 09:04:37.597879 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:37 crc kubenswrapper[4565]: I1125 09:04:37.598763 4565 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:37 crc kubenswrapper[4565]: I1125 09:04:37.598801 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:37 crc kubenswrapper[4565]: I1125 09:04:37.598810 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:37 crc kubenswrapper[4565]: I1125 09:04:37.601951 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 09:04:38 crc kubenswrapper[4565]: I1125 09:04:38.075100 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 09:04:38 crc kubenswrapper[4565]: I1125 09:04:38.149038 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:38 crc kubenswrapper[4565]: I1125 09:04:38.149659 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:38 crc kubenswrapper[4565]: I1125 09:04:38.149688 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:38 crc kubenswrapper[4565]: I1125 09:04:38.149698 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:38 crc kubenswrapper[4565]: I1125 09:04:38.152787 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 09:04:39 crc kubenswrapper[4565]: I1125 09:04:39.054595 4565 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: 
TLS handshake timeout Nov 25 09:04:39 crc kubenswrapper[4565]: I1125 09:04:39.146779 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Nov 25 09:04:39 crc kubenswrapper[4565]: I1125 09:04:39.146914 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:39 crc kubenswrapper[4565]: I1125 09:04:39.147681 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:39 crc kubenswrapper[4565]: I1125 09:04:39.147724 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:39 crc kubenswrapper[4565]: I1125 09:04:39.147735 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:39 crc kubenswrapper[4565]: I1125 09:04:39.150164 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:39 crc kubenswrapper[4565]: I1125 09:04:39.150843 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:39 crc kubenswrapper[4565]: I1125 09:04:39.150866 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:39 crc kubenswrapper[4565]: I1125 09:04:39.150877 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:39 crc kubenswrapper[4565]: I1125 09:04:39.167723 4565 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path 
\"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 25 09:04:39 crc kubenswrapper[4565]: I1125 09:04:39.167770 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 25 09:04:39 crc kubenswrapper[4565]: I1125 09:04:39.173879 4565 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 25 09:04:39 crc kubenswrapper[4565]: I1125 09:04:39.173952 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 25 09:04:40 crc kubenswrapper[4565]: I1125 09:04:40.152053 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:40 crc kubenswrapper[4565]: I1125 09:04:40.152795 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:40 crc kubenswrapper[4565]: I1125 09:04:40.152834 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:40 crc kubenswrapper[4565]: I1125 09:04:40.152843 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:41 crc kubenswrapper[4565]: I1125 09:04:41.542762 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:04:41 crc kubenswrapper[4565]: I1125 09:04:41.542916 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:41 crc kubenswrapper[4565]: I1125 09:04:41.543226 4565 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 25 09:04:41 crc kubenswrapper[4565]: I1125 09:04:41.543259 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 25 09:04:41 crc kubenswrapper[4565]: I1125 09:04:41.543688 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:41 crc kubenswrapper[4565]: I1125 09:04:41.543720 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:41 crc kubenswrapper[4565]: I1125 09:04:41.543729 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:41 crc kubenswrapper[4565]: I1125 09:04:41.546574 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:04:42 crc kubenswrapper[4565]: I1125 09:04:42.156038 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:42 crc kubenswrapper[4565]: I1125 09:04:42.156289 4565 patch_prober.go:28] interesting pod/kube-apiserver-crc 
container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 25 09:04:42 crc kubenswrapper[4565]: I1125 09:04:42.156337 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 25 09:04:42 crc kubenswrapper[4565]: I1125 09:04:42.156754 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:42 crc kubenswrapper[4565]: I1125 09:04:42.156802 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:42 crc kubenswrapper[4565]: I1125 09:04:42.156815 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:44 crc kubenswrapper[4565]: E1125 09:04:44.162723 4565 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Nov 25 09:04:44 crc kubenswrapper[4565]: I1125 09:04:44.164452 4565 trace.go:236] Trace[924745879]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (25-Nov-2025 09:04:31.353) (total time: 12810ms): Nov 25 09:04:44 crc kubenswrapper[4565]: Trace[924745879]: ---"Objects listed" error: 12810ms (09:04:44.164) Nov 25 09:04:44 crc kubenswrapper[4565]: Trace[924745879]: [12.810870376s] [12.810870376s] END Nov 25 09:04:44 crc kubenswrapper[4565]: I1125 09:04:44.164590 4565 reflector.go:368] Caches populated for *v1.RuntimeClass 
from k8s.io/client-go/informers/factory.go:160 Nov 25 09:04:44 crc kubenswrapper[4565]: I1125 09:04:44.165305 4565 trace.go:236] Trace[596845976]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (25-Nov-2025 09:04:31.484) (total time: 12680ms): Nov 25 09:04:44 crc kubenswrapper[4565]: Trace[596845976]: ---"Objects listed" error: 12680ms (09:04:44.165) Nov 25 09:04:44 crc kubenswrapper[4565]: Trace[596845976]: [12.680742389s] [12.680742389s] END Nov 25 09:04:44 crc kubenswrapper[4565]: I1125 09:04:44.165329 4565 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 25 09:04:44 crc kubenswrapper[4565]: I1125 09:04:44.166428 4565 trace.go:236] Trace[342382128]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (25-Nov-2025 09:04:30.993) (total time: 13173ms): Nov 25 09:04:44 crc kubenswrapper[4565]: Trace[342382128]: ---"Objects listed" error: 13173ms (09:04:44.166) Nov 25 09:04:44 crc kubenswrapper[4565]: Trace[342382128]: [13.17326294s] [13.17326294s] END Nov 25 09:04:44 crc kubenswrapper[4565]: I1125 09:04:44.166443 4565 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 25 09:04:44 crc kubenswrapper[4565]: I1125 09:04:44.167534 4565 trace.go:236] Trace[889713182]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (25-Nov-2025 09:04:30.642) (total time: 13524ms): Nov 25 09:04:44 crc kubenswrapper[4565]: Trace[889713182]: ---"Objects listed" error: 13524ms (09:04:44.167) Nov 25 09:04:44 crc kubenswrapper[4565]: Trace[889713182]: [13.524689447s] [13.524689447s] END Nov 25 09:04:44 crc kubenswrapper[4565]: I1125 09:04:44.167760 4565 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 25 09:04:44 crc kubenswrapper[4565]: I1125 09:04:44.186673 4565 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Nov 25 09:04:44 crc kubenswrapper[4565]: 
E1125 09:04:44.187260 4565 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.057035 4565 apiserver.go:52] "Watching apiserver" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.059156 4565 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.059383 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.059711 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.059765 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.059808 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.059862 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.059874 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.060012 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.060056 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.060265 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.060306 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.062635 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.062636 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.063756 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.064071 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.067516 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.067696 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.067868 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.068016 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.069463 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.075752 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.083911 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.090549 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.097235 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.105653 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.112639 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.122105 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.132947 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.150944 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.155301 4565 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.156603 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.162642 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.162815 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.164819 4565 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7" exitCode=255 Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.164868 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7"} Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.170617 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.176854 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.184451 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.191208 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.192335 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.192443 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.192516 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.192623 4565 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.192719 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.192813 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.192918 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.193277 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.193536 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod 
\"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.194150 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.194583 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.192710 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.192731 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.192866 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.192879 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.193022 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.193099 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.193162 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.193493 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.194110 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.194540 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.194787 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.195695 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.195966 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.196531 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.196572 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.196589 
4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.196607 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.196629 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.196650 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.196665 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.196679 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.196692 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.196706 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.196719 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.196733 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.196746 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.196759 4565 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.196772 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.196787 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.196802 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.196816 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.196831 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.196861 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197114 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197146 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197167 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197186 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " 
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197204 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197305 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197322 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197342 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197359 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197378 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197394 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197412 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197486 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197505 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197520 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197540 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197557 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197573 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197590 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197609 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197626 4565 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197642 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197658 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197805 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197833 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197859 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod 
\"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197877 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197894 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.197909 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198033 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198060 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198078 4565 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198101 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198121 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198140 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198161 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198276 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" 
(UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198297 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198312 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198334 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198357 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198376 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198393 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" 
(UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198412 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198430 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198450 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198474 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198493 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 25 
09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198510 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.195906 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198530 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198137 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198387 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198502 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198776 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198835 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.198999 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.199051 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.199175 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.199310 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.199425 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.199505 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.199529 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.199615 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.199670 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.199799 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.199849 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.199812 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.199989 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.200038 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.200164 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.200187 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.200233 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.200328 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.200572 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.200708 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.201033 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.201192 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.201452 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.201580 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.201729 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.202075 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.202282 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.203275 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.203341 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.203785 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.203816 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.204047 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.204256 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.204413 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.204496 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.204672 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.205076 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.205262 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.205480 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.205876 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.206256 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.206777 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.207648 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.208628 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.209132 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.209433 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.209922 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.210097 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.210138 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.210160 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod 
\"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.210180 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.210197 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.210215 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.210230 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.210250 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.210267 4565 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.210282 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.210300 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.210318 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.210335 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.210353 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.210369 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.210387 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.210402 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.210421 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.210440 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.210456 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.210760 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.210476 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211181 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211212 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211232 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211237 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211247 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211271 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211302 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211310 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211321 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211369 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211407 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211437 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211444 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211504 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211524 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211588 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211610 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211627 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211639 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211645 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211660 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211692 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211694 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211715 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211735 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211754 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211782 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211802 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211827 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211856 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211880 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211911 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211961 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.211980 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212000 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212009 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212017 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212031 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212084 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212102 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212108 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212166 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212184 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212196 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212217 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212234 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212289 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212309 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212354 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212362 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212373 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212394 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212409 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212430 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212457 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.214738 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.214764 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.214783 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212478 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212521 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212526 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212737 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212761 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.212883 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.213012 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.213089 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.213230 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.213260 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.213279 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.213460 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.214248 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.214284 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.214299 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.214305 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.214498 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.214524 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.214546 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.214666 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.214707 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.214988 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.215855 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.215990 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.213252 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-cvvlr"]
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.215878 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.215264 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.216294 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.216678 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.216694 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.217014 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz".
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.217214 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.217239 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.217589 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.218097 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.218524 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.218967 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.219297 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.219349 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.219727 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.220175 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.220200 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.215913 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.220396 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.220471 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.220585 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-cvvlr" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.220648 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.220734 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.220733 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.220775 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.220795 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.220812 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.220828 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.220854 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.220869 4565 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.221063 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.221144 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.221365 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.221524 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.221780 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.221879 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.221834 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.222008 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.222064 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.222257 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.222332 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.222389 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.222666 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.222797 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.222904 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.223150 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.223491 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.223722 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.223834 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.224065 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.224067 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.224218 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.224203 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.224468 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.224786 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.224800 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.224975 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.225211 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.225400 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.225614 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.225741 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.225857 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230130 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230157 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230172 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230189 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230203 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230216 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230229 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230244 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230257 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230270 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230286 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230300 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230335 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230350 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 25 
09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230370 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230384 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230397 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230410 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230424 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230437 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") 
pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230452 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230467 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230481 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230494 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230508 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230524 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230539 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230552 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230565 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230580 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230652 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230669 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230685 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230699 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230720 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230735 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230754 4565 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230768 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230782 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230797 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230831 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230862 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230879 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230896 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230910 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230937 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230955 4565 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230974 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.230990 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231006 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231021 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231038 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231052 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231068 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231132 4565 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231142 4565 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231151 4565 
reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231160 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231168 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231176 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231185 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231193 4565 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231201 4565 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231209 4565 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231217 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231225 4565 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231233 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231241 4565 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231248 4565 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231256 4565 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231263 4565 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" 
DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231271 4565 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231279 4565 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231286 4565 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231293 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231300 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231309 4565 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231316 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231324 4565 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231332 4565 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231340 4565 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231347 4565 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231355 4565 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231363 4565 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231371 4565 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231379 4565 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231388 4565 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231396 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231404 4565 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231412 4565 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231420 4565 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231427 4565 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231434 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: 
\"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231442 4565 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231450 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231458 4565 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231465 4565 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231473 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231480 4565 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231489 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231496 4565 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231504 4565 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231512 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231520 4565 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231527 4565 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231536 4565 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231543 4565 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231551 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231558 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231566 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231574 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231581 4565 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231588 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231596 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on 
node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231603 4565 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231611 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231618 4565 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231627 4565 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231634 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231642 4565 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231649 4565 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231657 4565 
reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231664 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231671 4565 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231678 4565 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231686 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231693 4565 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231701 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231708 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231715 4565 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231723 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231731 4565 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231738 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231746 4565 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231753 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231760 4565 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node 
\"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231768 4565 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231775 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231783 4565 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231791 4565 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231798 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231805 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231812 4565 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231819 4565 
reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231826 4565 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231834 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231851 4565 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231859 4565 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231867 4565 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231875 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231882 4565 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231891 4565 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231899 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231907 4565 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231914 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231922 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231943 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231951 4565 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" 
Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231959 4565 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231966 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231974 4565 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231982 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231990 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.231998 4565 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232005 4565 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232013 4565 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232020 4565 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232029 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232036 4565 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232044 4565 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232051 4565 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232059 4565 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232066 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" 
(UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232074 4565 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232081 4565 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232090 4565 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232097 4565 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232107 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232115 4565 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232123 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232131 4565 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232138 4565 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232146 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232153 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232160 4565 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232168 4565 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232175 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc 
kubenswrapper[4565]: I1125 09:04:45.232182 4565 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232189 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232196 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232204 4565 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232211 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232218 4565 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232226 4565 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232233 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232241 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232249 4565 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232257 4565 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232264 4565 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232272 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232279 4565 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232287 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: 
\"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232294 4565 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232303 4565 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232310 4565 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.232317 4565 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.233240 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.233787 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.234286 4565 scope.go:117] "RemoveContainer" containerID="de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.234726 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.234876 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.235026 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.235171 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.235361 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.235507 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.235817 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.236083 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.236288 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.236371 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:04:45.736355754 +0000 UTC m=+18.938850892 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.236612 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.236756 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.237034 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.237140 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.237329 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.237506 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.238088 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.238559 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.239236 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.239946 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.244438 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.236619 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.245550 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.245722 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.246475 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.247225 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.247777 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.248111 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.250708 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.251299 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.251615 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.252336 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.252857 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.253312 4565 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.254570 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 09:04:45.754555063 +0000 UTC m=+18.957050201 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.254651 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.253464 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.253585 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.253686 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.253870 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.253916 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.253381 4565 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.254612 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.254370 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.255990 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.254751 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.253340 4565 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.256261 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-11-25 09:04:45.756250155 +0000 UTC m=+18.958745292 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.256649 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.263979 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.269940 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.269959 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.269968 4565 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.270000 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 09:04:45.769990192 +0000 UTC m=+18.972485330 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.270030 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.270038 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.270045 4565 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.270071 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 09:04:45.770065304 +0000 UTC m=+18.972560442 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.270354 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.271402 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.279235 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.280076 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.280532 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.283258 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.295975 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.304994 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.308364 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.330535 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333246 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9dbb18f9-1819-4221-9486-4d042cd042d7-hosts-file\") pod \"node-resolver-cvvlr\" (UID: \"9dbb18f9-1819-4221-9486-4d042cd042d7\") " pod="openshift-dns/node-resolver-cvvlr" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333273 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn7n5\" (UniqueName: \"kubernetes.io/projected/9dbb18f9-1819-4221-9486-4d042cd042d7-kube-api-access-pn7n5\") pod \"node-resolver-cvvlr\" (UID: \"9dbb18f9-1819-4221-9486-4d042cd042d7\") " pod="openshift-dns/node-resolver-cvvlr" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333306 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333329 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333373 4565 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333383 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333392 4565 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333400 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333408 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333417 4565 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333425 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333433 4565 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333440 4565 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333448 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333455 4565 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333463 4565 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333470 4565 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc 
kubenswrapper[4565]: I1125 09:04:45.333477 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333486 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333493 4565 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333500 4565 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333508 4565 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333516 4565 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333524 4565 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333531 4565 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333538 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333546 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333553 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333560 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333567 4565 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333574 4565 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333580 4565 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 25 
09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333587 4565 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333595 4565 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333603 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333610 4565 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333617 4565 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333625 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333632 4565 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333641 
4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333648 4565 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333655 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333662 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333669 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333676 4565 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333684 4565 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333719 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.333814 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.350632 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.367347 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.371598 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.376669 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.383435 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.383553 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.394436 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.405145 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.425446 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.434054 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9dbb18f9-1819-4221-9486-4d042cd042d7-hosts-file\") pod \"node-resolver-cvvlr\" (UID: \"9dbb18f9-1819-4221-9486-4d042cd042d7\") " pod="openshift-dns/node-resolver-cvvlr" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.434085 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn7n5\" (UniqueName: \"kubernetes.io/projected/9dbb18f9-1819-4221-9486-4d042cd042d7-kube-api-access-pn7n5\") pod \"node-resolver-cvvlr\" (UID: \"9dbb18f9-1819-4221-9486-4d042cd042d7\") " pod="openshift-dns/node-resolver-cvvlr" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.434258 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9dbb18f9-1819-4221-9486-4d042cd042d7-hosts-file\") pod \"node-resolver-cvvlr\" (UID: \"9dbb18f9-1819-4221-9486-4d042cd042d7\") " pod="openshift-dns/node-resolver-cvvlr" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.436297 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.444808 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.454902 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.462458 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn7n5\" (UniqueName: \"kubernetes.io/projected/9dbb18f9-1819-4221-9486-4d042cd042d7-kube-api-access-pn7n5\") pod \"node-resolver-cvvlr\" (UID: \"9dbb18f9-1819-4221-9486-4d042cd042d7\") " pod="openshift-dns/node-resolver-cvvlr" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.468558 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.547732 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cvvlr" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.595850 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-r28bt"] Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.596150 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.597320 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.597601 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.597743 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.597897 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.601541 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.604601 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.611298 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.618761 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.627528 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.639453 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.646232 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.655499 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.666526 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.701422 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.714858 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.737126 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.737190 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/80bad26f-53b0-48f7-9ac4-110d3d8a475d-mcd-auth-proxy-config\") pod \"machine-config-daemon-r28bt\" (UID: \"80bad26f-53b0-48f7-9ac4-110d3d8a475d\") " pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.737239 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q97tj\" (UniqueName: \"kubernetes.io/projected/80bad26f-53b0-48f7-9ac4-110d3d8a475d-kube-api-access-q97tj\") pod \"machine-config-daemon-r28bt\" (UID: \"80bad26f-53b0-48f7-9ac4-110d3d8a475d\") " pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.737285 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:04:46.737272256 +0000 UTC m=+19.939767395 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.737368 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/80bad26f-53b0-48f7-9ac4-110d3d8a475d-proxy-tls\") pod \"machine-config-daemon-r28bt\" (UID: \"80bad26f-53b0-48f7-9ac4-110d3d8a475d\") " pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.737412 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/80bad26f-53b0-48f7-9ac4-110d3d8a475d-rootfs\") pod \"machine-config-daemon-r28bt\" (UID: \"80bad26f-53b0-48f7-9ac4-110d3d8a475d\") " pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.838310 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/80bad26f-53b0-48f7-9ac4-110d3d8a475d-mcd-auth-proxy-config\") pod \"machine-config-daemon-r28bt\" (UID: \"80bad26f-53b0-48f7-9ac4-110d3d8a475d\") " pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.838347 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.838384 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q97tj\" (UniqueName: \"kubernetes.io/projected/80bad26f-53b0-48f7-9ac4-110d3d8a475d-kube-api-access-q97tj\") pod \"machine-config-daemon-r28bt\" (UID: \"80bad26f-53b0-48f7-9ac4-110d3d8a475d\") " pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.838401 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.838416 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.838434 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.838456 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/80bad26f-53b0-48f7-9ac4-110d3d8a475d-rootfs\") pod \"machine-config-daemon-r28bt\" (UID: \"80bad26f-53b0-48f7-9ac4-110d3d8a475d\") " pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.838469 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/80bad26f-53b0-48f7-9ac4-110d3d8a475d-proxy-tls\") pod \"machine-config-daemon-r28bt\" (UID: \"80bad26f-53b0-48f7-9ac4-110d3d8a475d\") " pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.838984 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.839015 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.839026 4565 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.839030 4565 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.839068 4565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 09:04:46.839053518 +0000 UTC m=+20.041548655 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.839084 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 09:04:46.839077913 +0000 UTC m=+20.041573051 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.839108 4565 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.839140 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 09:04:46.83912366 +0000 UTC m=+20.041618798 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.839190 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.839201 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.839211 4565 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:04:45 crc kubenswrapper[4565]: E1125 09:04:45.839231 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 09:04:46.83922438 +0000 UTC m=+20.041719508 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.838985 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/80bad26f-53b0-48f7-9ac4-110d3d8a475d-mcd-auth-proxy-config\") pod \"machine-config-daemon-r28bt\" (UID: \"80bad26f-53b0-48f7-9ac4-110d3d8a475d\") " pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.839290 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/80bad26f-53b0-48f7-9ac4-110d3d8a475d-rootfs\") pod \"machine-config-daemon-r28bt\" (UID: \"80bad26f-53b0-48f7-9ac4-110d3d8a475d\") " pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.892973 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/80bad26f-53b0-48f7-9ac4-110d3d8a475d-proxy-tls\") pod \"machine-config-daemon-r28bt\" (UID: \"80bad26f-53b0-48f7-9ac4-110d3d8a475d\") " pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.892995 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q97tj\" (UniqueName: \"kubernetes.io/projected/80bad26f-53b0-48f7-9ac4-110d3d8a475d-kube-api-access-q97tj\") pod \"machine-config-daemon-r28bt\" (UID: \"80bad26f-53b0-48f7-9ac4-110d3d8a475d\") " 
pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.904796 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 09:04:45 crc kubenswrapper[4565]: W1125 09:04:45.913584 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80bad26f_53b0_48f7_9ac4_110d3d8a475d.slice/crio-8896e4acedbd813b4b7b063403d78fde043f7bd2d71b6c55c8dd46d58a647610 WatchSource:0}: Error finding container 8896e4acedbd813b4b7b063403d78fde043f7bd2d71b6c55c8dd46d58a647610: Status 404 returned error can't find the container with id 8896e4acedbd813b4b7b063403d78fde043f7bd2d71b6c55c8dd46d58a647610 Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.961761 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vk74d"] Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.962425 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-pmkqf"] Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.962622 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.962800 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-jpfp5"] Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.963011 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jpfp5" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.963122 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.964311 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.965208 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.965527 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.965660 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.965668 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.966530 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.966604 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.966651 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.966687 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.966823 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.966872 4565 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-multus"/"kube-root-ca.crt" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.966970 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.967137 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.971205 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.983690 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:45Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:45 crc kubenswrapper[4565]: I1125 09:04:45.999267 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:45Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.010981 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.025001 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.036688 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.045674 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.057104 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.064774 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.073869 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.082304 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.095705 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics 
northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\
"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m
dwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.102455 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.113375 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.121220 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.136564 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.140664 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-cnibin\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.140691 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-host-run-netns\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.140707 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/28292a27-3521-4953-af83-48804d5ed947-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pmkqf\" (UID: \"28292a27-3521-4953-af83-48804d5ed947\") " pod="openshift-multus/multus-additional-cni-plugins-pmkqf" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.140723 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.140739 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-system-cni-dir\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.140778 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-os-release\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.140812 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-cni-netd\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.140837 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-multus-cni-dir\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.140864 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6d96c20a-2514-47cf-99ec-a314bacac513-cni-binary-copy\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.140883 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-host-run-multus-certs\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.140904 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-kubelet\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.140967 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-host-var-lib-kubelet\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.140986 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6d96c20a-2514-47cf-99ec-a314bacac513-multus-daemon-config\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141000 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9cfr\" (UniqueName: \"kubernetes.io/projected/6d96c20a-2514-47cf-99ec-a314bacac513-kube-api-access-r9cfr\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141021 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-var-lib-openvswitch\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141078 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-node-log\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141105 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-log-socket\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141151 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/23e95c48-8d61-4222-a968-b86203ef8aab-ovnkube-script-lib\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141192 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-systemd-units\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141220 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-run-ovn\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141265 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-multus-socket-dir-parent\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141283 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-run-ovn-kubernetes\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141310 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/28292a27-3521-4953-af83-48804d5ed947-os-release\") pod \"multus-additional-cni-plugins-pmkqf\" (UID: \"28292a27-3521-4953-af83-48804d5ed947\") " pod="openshift-multus/multus-additional-cni-plugins-pmkqf" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141327 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/23e95c48-8d61-4222-a968-b86203ef8aab-env-overrides\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141350 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-etc-kubernetes\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141364 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/28292a27-3521-4953-af83-48804d5ed947-cnibin\") pod \"multus-additional-cni-plugins-pmkqf\" (UID: \"28292a27-3521-4953-af83-48804d5ed947\") " pod="openshift-multus/multus-additional-cni-plugins-pmkqf" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141379 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/28292a27-3521-4953-af83-48804d5ed947-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pmkqf\" (UID: \"28292a27-3521-4953-af83-48804d5ed947\") " pod="openshift-multus/multus-additional-cni-plugins-pmkqf" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141410 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-etc-openvswitch\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141442 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/28292a27-3521-4953-af83-48804d5ed947-cni-binary-copy\") pod \"multus-additional-cni-plugins-pmkqf\" (UID: \"28292a27-3521-4953-af83-48804d5ed947\") " pod="openshift-multus/multus-additional-cni-plugins-pmkqf" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141465 4565 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/23e95c48-8d61-4222-a968-b86203ef8aab-ovn-node-metrics-cert\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141480 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdwbt\" (UniqueName: \"kubernetes.io/projected/23e95c48-8d61-4222-a968-b86203ef8aab-kube-api-access-mdwbt\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141502 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-host-var-lib-cni-bin\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141522 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j9kq\" (UniqueName: \"kubernetes.io/projected/28292a27-3521-4953-af83-48804d5ed947-kube-api-access-9j9kq\") pod \"multus-additional-cni-plugins-pmkqf\" (UID: \"28292a27-3521-4953-af83-48804d5ed947\") " pod="openshift-multus/multus-additional-cni-plugins-pmkqf" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141535 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-run-openvswitch\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141549 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-host-var-lib-cni-multus\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141563 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-hostroot\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141577 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/23e95c48-8d61-4222-a968-b86203ef8aab-ovnkube-config\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141596 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-cni-bin\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141634 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-host-run-k8s-cni-cncf-io\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " 
pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141649 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-multus-conf-dir\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141669 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28292a27-3521-4953-af83-48804d5ed947-system-cni-dir\") pod \"multus-additional-cni-plugins-pmkqf\" (UID: \"28292a27-3521-4953-af83-48804d5ed947\") " pod="openshift-multus/multus-additional-cni-plugins-pmkqf" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141692 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-run-netns\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141706 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-run-systemd\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.141728 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-slash\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.155207 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.168289 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"573c3b01824cfa62276bcd39d7110e288d892edb79c1f4ffa9691d95369db0bd"} Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.169802 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56"} Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.169828 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4"} Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.169839 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"303ea49d2cd6e6cbea6adb2ced2b1b7ac7029048c409883aebdf0134379a1548"} Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.171389 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.172736 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad"} Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.172942 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.173861 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cvvlr" event={"ID":"9dbb18f9-1819-4221-9486-4d042cd042d7","Type":"ContainerStarted","Data":"80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1"} Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.173894 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cvvlr" event={"ID":"9dbb18f9-1819-4221-9486-4d042cd042d7","Type":"ContainerStarted","Data":"53f49ffd6188ec6af81d3e4b8a09122bdb969fbab0bc34a3259f71ecc34ba6eb"} Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.174772 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905"} Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.174795 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b4689d508216d1d9b62688f8feb439761eec574704179f06fcd8335f74706a77"} Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.175981 4565 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerStarted","Data":"13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24"} Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.176014 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerStarted","Data":"b0f35d7105f4f7ed4b023a99ac5b6878e1c205402a2133c7131e341db10af708"} Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.176026 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerStarted","Data":"8896e4acedbd813b4b7b063403d78fde043f7bd2d71b6c55c8dd46d58a647610"} Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.179536 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: E1125 09:04:46.182524 4565 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.237810 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242246 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-run-systemd\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242278 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-cni-bin\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242306 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-host-run-k8s-cni-cncf-io\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242323 4565 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-multus-conf-dir\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242336 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28292a27-3521-4953-af83-48804d5ed947-system-cni-dir\") pod \"multus-additional-cni-plugins-pmkqf\" (UID: \"28292a27-3521-4953-af83-48804d5ed947\") " pod="openshift-multus/multus-additional-cni-plugins-pmkqf" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242353 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-run-netns\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242365 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-slash\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242376 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-cni-bin\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242403 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242420 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-run-systemd\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242421 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-cnibin\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242454 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-host-run-netns\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242462 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-cnibin\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242469 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/28292a27-3521-4953-af83-48804d5ed947-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-pmkqf\" (UID: \"28292a27-3521-4953-af83-48804d5ed947\") " pod="openshift-multus/multus-additional-cni-plugins-pmkqf" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242488 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-system-cni-dir\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242493 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-host-run-k8s-cni-cncf-io\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242502 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-os-release\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242524 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-cni-netd\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242539 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-multus-cni-dir\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 
09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242553 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6d96c20a-2514-47cf-99ec-a314bacac513-cni-binary-copy\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242566 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-host-run-multus-certs\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242579 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9cfr\" (UniqueName: \"kubernetes.io/projected/6d96c20a-2514-47cf-99ec-a314bacac513-kube-api-access-r9cfr\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242591 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-kubelet\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242614 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-host-var-lib-kubelet\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242626 4565 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6d96c20a-2514-47cf-99ec-a314bacac513-multus-daemon-config\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242658 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-var-lib-openvswitch\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242672 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-node-log\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242689 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-log-socket\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242706 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/23e95c48-8d61-4222-a968-b86203ef8aab-ovnkube-script-lib\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242719 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-systemd-units\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242731 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-run-ovn\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242726 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-host-run-netns\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242749 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-multus-cni-dir\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242752 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-multus-socket-dir-parent\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242777 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-multus-conf-dir\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " 
pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242797 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28292a27-3521-4953-af83-48804d5ed947-system-cni-dir\") pod \"multus-additional-cni-plugins-pmkqf\" (UID: \"28292a27-3521-4953-af83-48804d5ed947\") " pod="openshift-multus/multus-additional-cni-plugins-pmkqf" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242819 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-run-netns\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242836 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-slash\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242872 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242921 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-system-cni-dir\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc 
kubenswrapper[4565]: I1125 09:04:46.242964 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-multus-socket-dir-parent\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242984 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-run-ovn-kubernetes\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243008 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-run-ovn-kubernetes\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243024 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/28292a27-3521-4953-af83-48804d5ed947-os-release\") pod \"multus-additional-cni-plugins-pmkqf\" (UID: \"28292a27-3521-4953-af83-48804d5ed947\") " pod="openshift-multus/multus-additional-cni-plugins-pmkqf" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243045 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/23e95c48-8d61-4222-a968-b86203ef8aab-env-overrides\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243071 
4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/28292a27-3521-4953-af83-48804d5ed947-os-release\") pod \"multus-additional-cni-plugins-pmkqf\" (UID: \"28292a27-3521-4953-af83-48804d5ed947\") " pod="openshift-multus/multus-additional-cni-plugins-pmkqf" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243116 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/28292a27-3521-4953-af83-48804d5ed947-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pmkqf\" (UID: \"28292a27-3521-4953-af83-48804d5ed947\") " pod="openshift-multus/multus-additional-cni-plugins-pmkqf" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.242674 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-os-release\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243280 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-var-lib-openvswitch\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243297 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-host-run-multus-certs\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243327 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/6d96c20a-2514-47cf-99ec-a314bacac513-cni-binary-copy\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243360 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-log-socket\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243377 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-kubelet\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243391 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-host-var-lib-kubelet\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243403 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-node-log\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243424 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-systemd-units\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243484 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/23e95c48-8d61-4222-a968-b86203ef8aab-env-overrides\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243518 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-etc-kubernetes\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243550 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/28292a27-3521-4953-af83-48804d5ed947-cnibin\") pod \"multus-additional-cni-plugins-pmkqf\" (UID: \"28292a27-3521-4953-af83-48804d5ed947\") " pod="openshift-multus/multus-additional-cni-plugins-pmkqf" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243564 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/28292a27-3521-4953-af83-48804d5ed947-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pmkqf\" (UID: \"28292a27-3521-4953-af83-48804d5ed947\") " pod="openshift-multus/multus-additional-cni-plugins-pmkqf" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243578 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-etc-openvswitch\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 
09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243609 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/28292a27-3521-4953-af83-48804d5ed947-cni-binary-copy\") pod \"multus-additional-cni-plugins-pmkqf\" (UID: \"28292a27-3521-4953-af83-48804d5ed947\") " pod="openshift-multus/multus-additional-cni-plugins-pmkqf" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243624 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/23e95c48-8d61-4222-a968-b86203ef8aab-ovn-node-metrics-cert\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243637 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdwbt\" (UniqueName: \"kubernetes.io/projected/23e95c48-8d61-4222-a968-b86203ef8aab-kube-api-access-mdwbt\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243658 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-etc-openvswitch\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243669 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-etc-kubernetes\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243679 4565 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/28292a27-3521-4953-af83-48804d5ed947-cnibin\") pod \"multus-additional-cni-plugins-pmkqf\" (UID: \"28292a27-3521-4953-af83-48804d5ed947\") " pod="openshift-multus/multus-additional-cni-plugins-pmkqf" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243793 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6d96c20a-2514-47cf-99ec-a314bacac513-multus-daemon-config\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243824 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-run-ovn\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.243991 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/23e95c48-8d61-4222-a968-b86203ef8aab-ovnkube-script-lib\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.244121 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-host-var-lib-cni-bin\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.244210 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j9kq\" (UniqueName: 
\"kubernetes.io/projected/28292a27-3521-4953-af83-48804d5ed947-kube-api-access-9j9kq\") pod \"multus-additional-cni-plugins-pmkqf\" (UID: \"28292a27-3521-4953-af83-48804d5ed947\") " pod="openshift-multus/multus-additional-cni-plugins-pmkqf" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.244269 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-host-var-lib-cni-bin\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.244253 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/28292a27-3521-4953-af83-48804d5ed947-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pmkqf\" (UID: \"28292a27-3521-4953-af83-48804d5ed947\") " pod="openshift-multus/multus-additional-cni-plugins-pmkqf" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.244336 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-cni-netd\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.244391 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-run-openvswitch\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.244397 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-run-openvswitch\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.244441 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-host-var-lib-cni-multus\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.244467 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-hostroot\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.244483 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/23e95c48-8d61-4222-a968-b86203ef8aab-ovnkube-config\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.244520 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-host-var-lib-cni-multus\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.244537 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6d96c20a-2514-47cf-99ec-a314bacac513-hostroot\") pod \"multus-jpfp5\" (UID: 
\"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.244973 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/23e95c48-8d61-4222-a968-b86203ef8aab-ovnkube-config\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.245332 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/28292a27-3521-4953-af83-48804d5ed947-cni-binary-copy\") pod \"multus-additional-cni-plugins-pmkqf\" (UID: \"28292a27-3521-4953-af83-48804d5ed947\") " pod="openshift-multus/multus-additional-cni-plugins-pmkqf" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.269118 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.293582 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/23e95c48-8d61-4222-a968-b86203ef8aab-ovn-node-metrics-cert\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.298977 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9cfr\" (UniqueName: \"kubernetes.io/projected/6d96c20a-2514-47cf-99ec-a314bacac513-kube-api-access-r9cfr\") pod \"multus-jpfp5\" (UID: \"6d96c20a-2514-47cf-99ec-a314bacac513\") " pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: 
I1125 09:04:46.307270 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdwbt\" (UniqueName: \"kubernetes.io/projected/23e95c48-8d61-4222-a968-b86203ef8aab-kube-api-access-mdwbt\") pod \"ovnkube-node-vk74d\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.327387 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j9kq\" (UniqueName: \"kubernetes.io/projected/28292a27-3521-4953-af83-48804d5ed947-kube-api-access-9j9kq\") pod \"multus-additional-cni-plugins-pmkqf\" (UID: \"28292a27-3521-4953-af83-48804d5ed947\") " pod="openshift-multus/multus-additional-cni-plugins-pmkqf" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.361801 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.400982 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.441203 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.482185 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.522069 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.562052 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.574386 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.578910 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jpfp5" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.587959 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.607151 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\
\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c205402a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.647006 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.685143 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.721064 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 
09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.751539 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:04:46 crc kubenswrapper[4565]: E1125 09:04:46.751700 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:04:48.751677814 +0000 UTC m=+21.954172952 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.778594 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.803327 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.841234 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.852726 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 
09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.852759 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.852786 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.852805 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:04:46 crc kubenswrapper[4565]: E1125 09:04:46.852865 4565 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 09:04:46 crc kubenswrapper[4565]: E1125 09:04:46.852899 4565 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 09:04:46 crc kubenswrapper[4565]: E1125 09:04:46.852910 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 09:04:46 crc kubenswrapper[4565]: E1125 09:04:46.852952 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 09:04:48.85291674 +0000 UTC m=+22.055411878 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 09:04:46 crc kubenswrapper[4565]: E1125 09:04:46.852956 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 09:04:46 crc kubenswrapper[4565]: E1125 09:04:46.852968 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 09:04:48.852961855 +0000 UTC m=+22.055456994 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 09:04:46 crc kubenswrapper[4565]: E1125 09:04:46.852970 4565 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:04:46 crc kubenswrapper[4565]: E1125 09:04:46.852983 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 09:04:46 crc kubenswrapper[4565]: E1125 09:04:46.852995 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 09:04:46 crc kubenswrapper[4565]: E1125 09:04:46.853005 4565 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:04:46 crc kubenswrapper[4565]: E1125 09:04:46.853011 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 09:04:48.853000448 +0000 UTC m=+22.055495586 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:04:46 crc kubenswrapper[4565]: E1125 09:04:46.853033 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 09:04:48.853021558 +0000 UTC m=+22.055516695 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.881784 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.924287 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:46 crc kubenswrapper[4565]: I1125 09:04:46.961357 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.001834 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.045538 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.097016 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.097055 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:04:47 crc kubenswrapper[4565]: E1125 09:04:47.097121 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.097028 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:04:47 crc kubenswrapper[4565]: E1125 09:04:47.097379 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:04:47 crc kubenswrapper[4565]: E1125 09:04:47.097457 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.112960 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.113525 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.114586 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.115352 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.122024 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.122562 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.131000 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.137198 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.137760 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.138773 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.139283 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.139765 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.140755 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.141475 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.142499 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.143210 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.145238 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.145779 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.146198 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.150280 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.150810 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.151282 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.155838 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-c
ontroller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.156508 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.156961 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.174455 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: 
I1125 09:04:47.174991 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.176341 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.177227 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.177940 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.180468 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.182310 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.187314 4565 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.187404 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" 
path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.194794 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.195312 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.198387 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c205402a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.199481 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.207697 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.210774 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.213085 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.213683 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.217489 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.218020 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.220319 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.220972 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.225545 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.226028 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.227632 4565 generic.go:334] "Generic (PLEG): container finished" podID="23e95c48-8d61-4222-a968-b86203ef8aab" containerID="babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678" exitCode=0 Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.228638 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.231909 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 
09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.234246 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.235013 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.235997 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.236491 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.236992 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.238205 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.238763 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.240126 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.240525 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" event={"ID":"28292a27-3521-4953-af83-48804d5ed947","Type":"ContainerStarted","Data":"018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b"} Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.240552 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" event={"ID":"28292a27-3521-4953-af83-48804d5ed947","Type":"ContainerStarted","Data":"aa0668fe3ffb68a572ae658f7b21d5eb43ad5e016f7da258689db6e0e781ca60"} Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.240565 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jpfp5" 
event={"ID":"6d96c20a-2514-47cf-99ec-a314bacac513","Type":"ContainerStarted","Data":"a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9"} Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.240574 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jpfp5" event={"ID":"6d96c20a-2514-47cf-99ec-a314bacac513","Type":"ContainerStarted","Data":"465ad648edd4a67f19d1b217d459754da70c1993b354ed69ec82540c35cfd3a3"} Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.240582 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerDied","Data":"babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678"} Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.240595 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerStarted","Data":"9a23e7275baf68cf079137414bbfda4c84774b130006cf9b18511cc0fa55d9d9"} Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.268951 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.283887 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.324788 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.364618 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.387541 4565 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.389179 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.389224 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.389238 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.389532 4565 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.404113 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.458514 4565 kubelet_node_status.go:115] "Node was previously registered" node="crc" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.458709 4565 kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.459672 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.459696 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.459705 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.459721 4565 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.459730 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:47Z","lastTransitionTime":"2025-11-25T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:47 crc kubenswrapper[4565]: E1125 09:04:47.483909 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.490634 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.491275 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.491306 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.491316 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:47 crc 
kubenswrapper[4565]: I1125 09:04:47.491343 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.491571 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:47Z","lastTransitionTime":"2025-11-25T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:47 crc kubenswrapper[4565]: E1125 09:04:47.519667 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient 
PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",
\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":
[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6f
b6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\
"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.525875 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.525910 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.525919 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.525947 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.525957 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:47Z","lastTransitionTime":"2025-11-25T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.528698 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: E1125 09:04:47.540585 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.543568 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.543600 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.543610 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.543622 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.543630 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:47Z","lastTransitionTime":"2025-11-25T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:47 crc kubenswrapper[4565]: E1125 09:04:47.553616 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.557767 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.557797 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.557806 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.557820 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.557830 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:47Z","lastTransitionTime":"2025-11-25T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.564200 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: E1125 09:04:47.569010 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: E1125 09:04:47.569119 4565 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.570532 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.570566 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.570577 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.570595 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.570605 4565 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:47Z","lastTransitionTime":"2025-11-25T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.603580 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.642785 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.672731 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.672771 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.672783 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.672800 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.672811 4565 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:47Z","lastTransitionTime":"2025-11-25T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.684647 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.721990 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.762703 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.775340 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:47 crc 
kubenswrapper[4565]: I1125 09:04:47.775379 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.775390 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.775411 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.775420 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:47Z","lastTransitionTime":"2025-11-25T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.803316 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.841155 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.876992 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.877033 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.877044 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.877059 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.877068 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:47Z","lastTransitionTime":"2025-11-25T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.882068 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.921617 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.961977 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.982072 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.982106 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.982117 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.982132 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:47 crc kubenswrapper[4565]: I1125 09:04:47.982142 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:47Z","lastTransitionTime":"2025-11-25T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:47.999973 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.040869 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c205402a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.083737 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.083766 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.083776 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.083788 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.083795 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:48Z","lastTransitionTime":"2025-11-25T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.085119 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.121712 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.185650 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.185684 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.185695 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.185709 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.185718 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:48Z","lastTransitionTime":"2025-11-25T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.232356 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd"} Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.233568 4565 generic.go:334] "Generic (PLEG): container finished" podID="28292a27-3521-4953-af83-48804d5ed947" containerID="018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b" exitCode=0 Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.233631 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" event={"ID":"28292a27-3521-4953-af83-48804d5ed947","Type":"ContainerDied","Data":"018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b"} Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.238631 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerStarted","Data":"070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a"} Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.238664 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerStarted","Data":"2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c"} Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.238676 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerStarted","Data":"3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a"} Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.238684 4565 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerStarted","Data":"90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe"} Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.238692 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerStarted","Data":"7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613"} Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.238702 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerStarted","Data":"9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa"} Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.242439 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.252990 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.263480 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.281534 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.287817 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.287877 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.287888 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.287901 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.287913 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:48Z","lastTransitionTime":"2025-11-25T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.321497 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.367077 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.390204 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.390235 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.390245 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.390257 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.390267 4565 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:48Z","lastTransitionTime":"2025-11-25T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.401779 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.440588 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c20540
2a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.483509 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 
09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.492430 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.492462 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.492472 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.492485 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.492494 4565 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:48Z","lastTransitionTime":"2025-11-25T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.521813 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.561018 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.594272 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.594319 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.594330 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.594346 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.594355 4565 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:48Z","lastTransitionTime":"2025-11-25T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.601428 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:
04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.642498 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.682274 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.698478 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.698507 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.698517 4565 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.698530 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.698542 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:48Z","lastTransitionTime":"2025-11-25T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.726876 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd
1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.767957 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.769177 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:04:48 crc kubenswrapper[4565]: E1125 09:04:48.769383 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:04:52.769356494 +0000 UTC m=+25.971851633 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.800522 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.800688 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.800716 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.800732 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.800752 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:48Z","lastTransitionTime":"2025-11-25T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.802419 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.842132 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.870185 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.870220 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.870250 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.870269 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:04:48 crc kubenswrapper[4565]: E1125 09:04:48.870343 4565 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 09:04:48 crc kubenswrapper[4565]: E1125 09:04:48.870360 4565 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 09:04:48 crc kubenswrapper[4565]: E1125 09:04:48.870374 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 09:04:48 crc kubenswrapper[4565]: E1125 09:04:48.870393 4565 projected.go:288] Couldn't get 
configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 09:04:48 crc kubenswrapper[4565]: E1125 09:04:48.870406 4565 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:04:48 crc kubenswrapper[4565]: E1125 09:04:48.870409 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 09:04:48 crc kubenswrapper[4565]: E1125 09:04:48.870433 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 09:04:48 crc kubenswrapper[4565]: E1125 09:04:48.870406 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 09:04:52.870394461 +0000 UTC m=+26.072889600 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 09:04:48 crc kubenswrapper[4565]: E1125 09:04:48.870444 4565 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:04:48 crc kubenswrapper[4565]: E1125 09:04:48.870460 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 09:04:52.870446369 +0000 UTC m=+26.072941507 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 09:04:48 crc kubenswrapper[4565]: E1125 09:04:48.870471 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 09:04:52.870465646 +0000 UTC m=+26.072960784 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:04:48 crc kubenswrapper[4565]: E1125 09:04:48.870482 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 09:04:52.870477558 +0000 UTC m=+26.072972696 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.881460 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.902709 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.902743 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.902753 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.902768 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.902776 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:48Z","lastTransitionTime":"2025-11-25T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.921505 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:48 crc kubenswrapper[4565]: I1125 09:04:48.962071 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.001902 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.005100 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.005132 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.005143 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.005156 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.005165 4565 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:49Z","lastTransitionTime":"2025-11-25T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.041001 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aa
f09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c205402a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-25T09:04:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.085258 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.096155 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.096187 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.096152 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:04:49 crc kubenswrapper[4565]: E1125 09:04:49.096249 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:04:49 crc kubenswrapper[4565]: E1125 09:04:49.096294 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:04:49 crc kubenswrapper[4565]: E1125 09:04:49.096348 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.107004 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.107034 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.107043 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.107056 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.107064 4565 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:49Z","lastTransitionTime":"2025-11-25T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.122866 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.163116 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:49Z is after 2025-08-24T17:21:41Z" Nov 25 
09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.168409 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.177000 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.205087 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.209178 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.209207 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.209218 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.209232 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.209241 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:49Z","lastTransitionTime":"2025-11-25T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.221899 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.243036 4565 generic.go:334] "Generic (PLEG): container finished" podID="28292a27-3521-4953-af83-48804d5ed947" containerID="4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253" exitCode=0 Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.243112 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" event={"ID":"28292a27-3521-4953-af83-48804d5ed947","Type":"ContainerDied","Data":"4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253"} Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.262837 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:04:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:49 crc kubenswrapper[4565]: E1125 09:04:49.280489 4565 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.310844 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.310873 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.310883 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.310894 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.310904 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:49Z","lastTransitionTime":"2025-11-25T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.323098 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.362794 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.402415 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.412677 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.412711 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.412720 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.412735 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.412745 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:49Z","lastTransitionTime":"2025-11-25T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.442693 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe3
77e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.482346 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c20540
2a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.514904 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.514959 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.514972 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:49 crc 
kubenswrapper[4565]: I1125 09:04:49.514988 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.515000 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:49Z","lastTransitionTime":"2025-11-25T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.525678 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.567617 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.602016 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-dpgqk"] Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.602387 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-dpgqk" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.604079 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.615658 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.616332 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.616365 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.616378 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.616393 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 
09:04:49.616412 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:49Z","lastTransitionTime":"2025-11-25T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.636243 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.656062 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.676075 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.718330 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.718361 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.718370 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.718382 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.718391 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:49Z","lastTransitionTime":"2025-11-25T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.725309 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.762983 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.777497 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj72n\" 
(UniqueName: \"kubernetes.io/projected/03764f22-c722-4de2-986b-9236cd9ef0af-kube-api-access-hj72n\") pod \"node-ca-dpgqk\" (UID: \"03764f22-c722-4de2-986b-9236cd9ef0af\") " pod="openshift-image-registry/node-ca-dpgqk" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.777531 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/03764f22-c722-4de2-986b-9236cd9ef0af-serviceca\") pod \"node-ca-dpgqk\" (UID: \"03764f22-c722-4de2-986b-9236cd9ef0af\") " pod="openshift-image-registry/node-ca-dpgqk" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.777555 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03764f22-c722-4de2-986b-9236cd9ef0af-host\") pod \"node-ca-dpgqk\" (UID: \"03764f22-c722-4de2-986b-9236cd9ef0af\") " pod="openshift-image-registry/node-ca-dpgqk" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.804029 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.820195 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.820227 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:49 crc 
kubenswrapper[4565]: I1125 09:04:49.820236 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.820251 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.820263 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:49Z","lastTransitionTime":"2025-11-25T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.842283 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.878820 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj72n\" (UniqueName: \"kubernetes.io/projected/03764f22-c722-4de2-986b-9236cd9ef0af-kube-api-access-hj72n\") pod \"node-ca-dpgqk\" (UID: \"03764f22-c722-4de2-986b-9236cd9ef0af\") " pod="openshift-image-registry/node-ca-dpgqk" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.878873 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/03764f22-c722-4de2-986b-9236cd9ef0af-serviceca\") pod \"node-ca-dpgqk\" (UID: \"03764f22-c722-4de2-986b-9236cd9ef0af\") " pod="openshift-image-registry/node-ca-dpgqk" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.878900 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03764f22-c722-4de2-986b-9236cd9ef0af-host\") pod \"node-ca-dpgqk\" (UID: \"03764f22-c722-4de2-986b-9236cd9ef0af\") " pod="openshift-image-registry/node-ca-dpgqk" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.878991 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/03764f22-c722-4de2-986b-9236cd9ef0af-host\") pod \"node-ca-dpgqk\" (UID: \"03764f22-c722-4de2-986b-9236cd9ef0af\") " pod="openshift-image-registry/node-ca-dpgqk" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.879786 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/03764f22-c722-4de2-986b-9236cd9ef0af-serviceca\") pod \"node-ca-dpgqk\" (UID: \"03764f22-c722-4de2-986b-9236cd9ef0af\") " pod="openshift-image-registry/node-ca-dpgqk" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.882178 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.914780 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj72n\" 
(UniqueName: \"kubernetes.io/projected/03764f22-c722-4de2-986b-9236cd9ef0af-kube-api-access-hj72n\") pod \"node-ca-dpgqk\" (UID: \"03764f22-c722-4de2-986b-9236cd9ef0af\") " pod="openshift-image-registry/node-ca-dpgqk" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.922065 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.922099 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.922112 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.922125 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.922134 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:49Z","lastTransitionTime":"2025-11-25T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.942804 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:49 crc kubenswrapper[4565]: I1125 09:04:49.981984 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.021076 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.026285 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:50 crc 
kubenswrapper[4565]: I1125 09:04:50.026310 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.026319 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.026334 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.026342 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:50Z","lastTransitionTime":"2025-11-25T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.063666 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.100710 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.128314 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.128345 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.128355 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.128367 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.128376 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:50Z","lastTransitionTime":"2025-11-25T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.145951 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.181554 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.212507 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-dpgqk" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.222559 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:50 crc kubenswrapper[4565]: W1125 09:04:50.223748 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03764f22_c722_4de2_986b_9236cd9ef0af.slice/crio-de02ed978175297f24dfde8c43b0268eea9a80028033723738383c6abc64f2f1 WatchSource:0}: Error finding container de02ed978175297f24dfde8c43b0268eea9a80028033723738383c6abc64f2f1: Status 404 returned error can't find the container with id de02ed978175297f24dfde8c43b0268eea9a80028033723738383c6abc64f2f1 Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.229979 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.230012 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.230022 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.230035 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.230043 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:50Z","lastTransitionTime":"2025-11-25T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.247877 4565 generic.go:334] "Generic (PLEG): container finished" podID="28292a27-3521-4953-af83-48804d5ed947" containerID="950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c" exitCode=0 Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.247969 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" event={"ID":"28292a27-3521-4953-af83-48804d5ed947","Type":"ContainerDied","Data":"950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c"} Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.249852 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dpgqk" event={"ID":"03764f22-c722-4de2-986b-9236cd9ef0af","Type":"ContainerStarted","Data":"de02ed978175297f24dfde8c43b0268eea9a80028033723738383c6abc64f2f1"} Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.253287 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerStarted","Data":"30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5"} Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.261535 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.302667 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.333330 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.333363 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.333373 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.333388 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.333399 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:50Z","lastTransitionTime":"2025-11-25T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.344482 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc
/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c205402a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.418372 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.432186 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.440433 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.440464 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.440480 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 
09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.440498 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.440509 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:50Z","lastTransitionTime":"2025-11-25T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.467054 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.502169 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.542340 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.542376 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.542386 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:50 crc 
kubenswrapper[4565]: I1125 09:04:50.542401 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.542410 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:50Z","lastTransitionTime":"2025-11-25T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.543970 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.580977 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.627450 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.645373 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.645421 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.645438 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.645454 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 
09:04:50.645480 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:50Z","lastTransitionTime":"2025-11-25T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.662906 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.705173 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.747298 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.747558 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.747670 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.747743 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 
09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.747808 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.747876 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:50Z","lastTransitionTime":"2025-11-25T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.784023 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85
aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.822175 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5
b6878e1c205402a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.850512 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.850554 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.850564 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:50 crc 
kubenswrapper[4565]: I1125 09:04:50.850579 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.850589 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:50Z","lastTransitionTime":"2025-11-25T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.865915 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.903323 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 
09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.942615 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.953282 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.953321 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.953332 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.953350 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.953358 4565 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:50Z","lastTransitionTime":"2025-11-25T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:50 crc kubenswrapper[4565]: I1125 09:04:50.982013 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:
04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.023746 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 
09:04:51.055217 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.055259 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.055272 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.055288 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.055303 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:51Z","lastTransitionTime":"2025-11-25T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.097130 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.097187 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:04:51 crc kubenswrapper[4565]: E1125 09:04:51.097243 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.097319 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:04:51 crc kubenswrapper[4565]: E1125 09:04:51.097490 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:04:51 crc kubenswrapper[4565]: E1125 09:04:51.097648 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.157227 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.157258 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.157267 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.157280 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.157291 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:51Z","lastTransitionTime":"2025-11-25T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.257238 4565 generic.go:334] "Generic (PLEG): container finished" podID="28292a27-3521-4953-af83-48804d5ed947" containerID="ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de" exitCode=0 Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.257319 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" event={"ID":"28292a27-3521-4953-af83-48804d5ed947","Type":"ContainerDied","Data":"ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de"} Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.258598 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.258615 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.258624 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.258634 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.258642 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:51Z","lastTransitionTime":"2025-11-25T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.259207 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dpgqk" event={"ID":"03764f22-c722-4de2-986b-9236cd9ef0af","Type":"ContainerStarted","Data":"2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f"} Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.267116 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.276506 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.293778 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.301797 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.316852 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.328024 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.339218 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.348081 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.360995 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.361027 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.361036 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.361052 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.361063 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:51Z","lastTransitionTime":"2025-11-25T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.382922 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.422377 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.461738 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c20540
2a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.463067 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.463098 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.463107 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:51 crc 
kubenswrapper[4565]: I1125 09:04:51.463121 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.463131 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:51Z","lastTransitionTime":"2025-11-25T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.506070 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.541886 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.567274 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.567346 4565 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.567364 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.567391 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.567414 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:51Z","lastTransitionTime":"2025-11-25T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.587444 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.623640 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 
09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.663720 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.669042 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.669141 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.669202 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.669269 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.669326 4565 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:51Z","lastTransitionTime":"2025-11-25T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.701669 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:
04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.744375 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.771326 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.771369 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.771379 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.771395 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.771403 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:51Z","lastTransitionTime":"2025-11-25T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.785731 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.821803 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.862738 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.872876 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.872913 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.872924 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.872958 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.872968 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:51Z","lastTransitionTime":"2025-11-25T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.905711 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.943315 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.975677 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.975709 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.975718 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.975734 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.975744 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:51Z","lastTransitionTime":"2025-11-25T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:51 crc kubenswrapper[4565]: I1125 09:04:51.980465 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.023564 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a793
79b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"s
tate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.063725 4565 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c205402a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.078173 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.078195 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.078203 4565 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.078217 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.078227 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:52Z","lastTransitionTime":"2025-11-25T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.106180 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.139795 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.179519 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.179548 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.179557 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.179568 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.179577 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:52Z","lastTransitionTime":"2025-11-25T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.183407 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.222366 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 
09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.263661 4565 generic.go:334] "Generic (PLEG): container finished" podID="28292a27-3521-4953-af83-48804d5ed947" containerID="94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975" exitCode=0 Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.263734 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" event={"ID":"28292a27-3521-4953-af83-48804d5ed947","Type":"ContainerDied","Data":"94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975"} Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.268139 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerStarted","Data":"fa072117cd29736444da6447e68520d4cd38fe1bdcefa94fe4348d942e386371"} Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.268231 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.268363 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.276723 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.281283 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:52 crc 
kubenswrapper[4565]: I1125 09:04:52.281320 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.281332 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.281346 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.281355 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:52Z","lastTransitionTime":"2025-11-25T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.293153 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.293328 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.304810 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.342483 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.381977 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.382836 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.382873 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.382885 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.382899 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.382908 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:52Z","lastTransitionTime":"2025-11-25T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.424252 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.461069 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703
f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.485883 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.485923 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.485997 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.486014 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.486024 4565 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:52Z","lastTransitionTime":"2025-11-25T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.506100 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.542763 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.581637 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.588504 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.588533 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.588544 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.588559 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.588569 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:52Z","lastTransitionTime":"2025-11-25T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.621767 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.662998 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.690047 4565 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.690085 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.690096 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.690114 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.690124 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:52Z","lastTransitionTime":"2025-11-25T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.702007 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c205402a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.747788 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.782388 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.791430 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.791456 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.791465 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.791479 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.791488 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:52Z","lastTransitionTime":"2025-11-25T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.803195 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:04:52 crc kubenswrapper[4565]: E1125 09:04:52.803298 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:05:00.803278996 +0000 UTC m=+34.005774133 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.821187 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.860968 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.893496 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.893536 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.893547 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.893564 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.893573 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:52Z","lastTransitionTime":"2025-11-25T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.901327 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc
/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c205402a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.904579 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.904616 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.904642 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.904662 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:04:52 crc kubenswrapper[4565]: E1125 09:04:52.904727 4565 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 09:04:52 crc kubenswrapper[4565]: E1125 09:04:52.904766 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 09:04:52 crc kubenswrapper[4565]: E1125 09:04:52.904793 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 09:04:52 crc kubenswrapper[4565]: E1125 09:04:52.904799 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 09:05:00.904783323 +0000 UTC m=+34.107278461 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 09:04:52 crc kubenswrapper[4565]: E1125 09:04:52.904806 4565 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:04:52 crc kubenswrapper[4565]: E1125 09:04:52.904768 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 09:04:52 crc kubenswrapper[4565]: E1125 09:04:52.904858 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 09:04:52 crc kubenswrapper[4565]: E1125 09:04:52.904880 4565 projected.go:194] Error preparing data for projected volume 
kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:04:52 crc kubenswrapper[4565]: E1125 09:04:52.904876 4565 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 09:04:52 crc kubenswrapper[4565]: E1125 09:04:52.904843 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 09:05:00.904831744 +0000 UTC m=+34.107326882 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:04:52 crc kubenswrapper[4565]: E1125 09:04:52.904921 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 09:05:00.904912586 +0000 UTC m=+34.107407724 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:04:52 crc kubenswrapper[4565]: E1125 09:04:52.904972 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 09:05:00.904965326 +0000 UTC m=+34.107460464 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.945222 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa072117cd29736444da6447e68520d4cd38fe1bdcefa94fe4348d942e386371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.979683 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.995491 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.995523 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.995535 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 09:04:52.995553 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:52 crc kubenswrapper[4565]: I1125 
09:04:52.995564 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:52Z","lastTransitionTime":"2025-11-25T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.022370 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.060831 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 
09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.096188 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.096228 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.096269 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:04:53 crc kubenswrapper[4565]: E1125 09:04:53.096307 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:04:53 crc kubenswrapper[4565]: E1125 09:04:53.096366 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:04:53 crc kubenswrapper[4565]: E1125 09:04:53.096429 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.097299 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.097328 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.097339 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.097353 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.097362 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:53Z","lastTransitionTime":"2025-11-25T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.102973 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.141834 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.182379 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.199222 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.199245 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.199253 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.199263 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.199271 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:53Z","lastTransitionTime":"2025-11-25T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.226066 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.262091 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.274476 4565 generic.go:334] "Generic (PLEG): container finished" podID="28292a27-3521-4953-af83-48804d5ed947" containerID="d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3" exitCode=0 Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.274555 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" 
event={"ID":"28292a27-3521-4953-af83-48804d5ed947","Type":"ContainerDied","Data":"d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3"} Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.274595 4565 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.302432 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.302614 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.302667 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.302753 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.302809 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:53Z","lastTransitionTime":"2025-11-25T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.314369 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.350492 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.385122 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.405272 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.405310 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.405321 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.405336 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.405346 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:53Z","lastTransitionTime":"2025-11-25T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.424941 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.463613 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a793
79b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"s
tate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.503635 4565 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c205402a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.507798 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.507829 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.507839 4565 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.507855 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.507864 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:53Z","lastTransitionTime":"2025-11-25T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.546460 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa072117cd29736444da6447e68520d4cd38fe1bdcefa94fe4348d942e386371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.581791 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.610007 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.610038 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.610048 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.610061 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 
09:04:53.610068 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:53Z","lastTransitionTime":"2025-11-25T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.622947 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.663528 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 
09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.703281 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.711775 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.711808 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.711817 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.711831 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.711839 4565 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:53Z","lastTransitionTime":"2025-11-25T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.742658 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.784209 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.813637 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:53 crc 
kubenswrapper[4565]: I1125 09:04:53.813669 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.813678 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.813691 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.813700 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:53Z","lastTransitionTime":"2025-11-25T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.823816 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.861758 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.906260 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019be
e1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1
dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.916054 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.916081 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.916091 4565 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.916102 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.916111 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:53Z","lastTransitionTime":"2025-11-25T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.944652 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:53 crc kubenswrapper[4565]: I1125 09:04:53.981060 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.018657 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.018690 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.018700 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.018713 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.018723 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:54Z","lastTransitionTime":"2025-11-25T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.023141 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.122566 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.122599 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.122610 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.122622 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.122632 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:54Z","lastTransitionTime":"2025-11-25T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.224178 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.224213 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.224223 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.224236 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.224247 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:54Z","lastTransitionTime":"2025-11-25T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.280381 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" event={"ID":"28292a27-3521-4953-af83-48804d5ed947","Type":"ContainerStarted","Data":"2b454e10f5a27e7fcf592c6e895778990ee13d94756f5bd6a0ae252e1b00ade5"} Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.281716 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vk74d_23e95c48-8d61-4222-a968-b86203ef8aab/ovnkube-controller/0.log" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.284262 4565 generic.go:334] "Generic (PLEG): container finished" podID="23e95c48-8d61-4222-a968-b86203ef8aab" containerID="fa072117cd29736444da6447e68520d4cd38fe1bdcefa94fe4348d942e386371" exitCode=1 Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.284341 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerDied","Data":"fa072117cd29736444da6447e68520d4cd38fe1bdcefa94fe4348d942e386371"} Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.284835 4565 scope.go:117] "RemoveContainer" containerID="fa072117cd29736444da6447e68520d4cd38fe1bdcefa94fe4348d942e386371" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.294921 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.306903 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.315333 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c20540
2a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.325532 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.325624 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.325692 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:54 crc 
kubenswrapper[4565]: I1125 09:04:54.325750 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.325807 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:54Z","lastTransitionTime":"2025-11-25T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.332217 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa072117cd29736444da6447e68520d4cd38fe1bdcefa94fe4348d942e386371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.343693 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.354269 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 
09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.363995 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"
mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 
09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.378120 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b454e10f5a27e7fcf592c6e895778990ee13d94756f5bd6a0ae252e1b00ade5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c
8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"po
dIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.387047 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.420997 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.427312 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.427337 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.427347 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.427359 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.427369 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:54Z","lastTransitionTime":"2025-11-25T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.462536 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.502464 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.529323 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.529358 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.529367 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.529381 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.529396 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:54Z","lastTransitionTime":"2025-11-25T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.551340 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.589157 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.624763 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.631046 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.631082 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.631091 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.631104 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.631113 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:54Z","lastTransitionTime":"2025-11-25T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.661349 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.701604 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.733597 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:54 crc 
kubenswrapper[4565]: I1125 09:04:54.733630 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.733638 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.733662 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.733672 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:54Z","lastTransitionTime":"2025-11-25T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.743640 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b454e10f5a27e7fcf592c6e895778990ee13d94756f5bd6a0ae252e1b00ade5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0e3
d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.781071 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.822186 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.837452 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.837492 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.837502 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.837515 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.837523 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:54Z","lastTransitionTime":"2025-11-25T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.862572 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.900872 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39ae
d0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.939691 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.939719 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.939728 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.939741 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.939749 4565 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:54Z","lastTransitionTime":"2025-11-25T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.946921 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:54 crc kubenswrapper[4565]: I1125 09:04:54.981026 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.025325 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa072117cd29736444da6447e68520d4cd38fe1bdcefa94fe4348d942e386371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa072117cd29736444da6447e68520d4cd38fe1bdcefa94fe4348d942e386371\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"message\\\":\\\"s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 09:04:53.801855 5737 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 09:04:53.801944 5737 
reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 09:04:53.802100 5737 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 09:04:53.802304 5737 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:53.802363 5737 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 09:04:53.802398 5737 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:53.802493 5737 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-
overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termina
ted\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.041265 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.041295 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.041304 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.041320 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.041329 4565 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:55Z","lastTransitionTime":"2025-11-25T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.059887 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.096755 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:04:55 crc kubenswrapper[4565]: E1125 09:04:55.096841 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.097102 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:04:55 crc kubenswrapper[4565]: E1125 09:04:55.097154 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.097196 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:04:55 crc kubenswrapper[4565]: E1125 09:04:55.097234 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.102481 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-cert
s\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\
\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.140629 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c20540
2a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.142618 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.142654 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.142670 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:55 crc 
kubenswrapper[4565]: I1125 09:04:55.142691 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.142701 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:55Z","lastTransitionTime":"2025-11-25T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.181775 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.221721 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 
09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.243959 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.243981 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.243989 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.244000 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.244008 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:55Z","lastTransitionTime":"2025-11-25T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.290736 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vk74d_23e95c48-8d61-4222-a968-b86203ef8aab/ovnkube-controller/1.log" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.291140 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vk74d_23e95c48-8d61-4222-a968-b86203ef8aab/ovnkube-controller/0.log" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.292794 4565 generic.go:334] "Generic (PLEG): container finished" podID="23e95c48-8d61-4222-a968-b86203ef8aab" containerID="656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896" exitCode=1 Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.292819 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerDied","Data":"656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896"} Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.292853 4565 scope.go:117] "RemoveContainer" containerID="fa072117cd29736444da6447e68520d4cd38fe1bdcefa94fe4348d942e386371" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.293305 4565 scope.go:117] "RemoveContainer" containerID="656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896" Nov 25 09:04:55 crc kubenswrapper[4565]: E1125 09:04:55.293429 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.302454 4565 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.311343 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.340166 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c20540
2a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.345342 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.345375 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.345384 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:55 crc 
kubenswrapper[4565]: I1125 09:04:55.345397 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.345406 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:55Z","lastTransitionTime":"2025-11-25T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.384470 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa072117cd29736444da6447e68520d4cd38fe1bdcefa94fe4348d942e386371\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"message\\\":\\\"s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 09:04:53.801855 5737 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 09:04:53.801944 5737 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 09:04:53.802100 5737 
reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 09:04:53.802304 5737 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:53.802363 5737 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 09:04:53.802398 5737 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:53.802493 5737 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"message\\\":\\\"4:54.845681 5883 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 09:04:54.845732 5883 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:54.846028 5883 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:54.847559 5883 handler.go:190] Sending 
*v1.Namespace event handler 1 for removal\\\\nI1125 09:04:54.847583 5883 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 09:04:54.848181 5883 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 09:04:54.848211 5883 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 09:04:54.849240 5883 factory.go:656] Stopping watch factory\\\\nI1125 09:04:54.885040 5883 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 09:04:54.885060 5883 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 09:04:54.885094 5883 ovnkube.go:599] Stopped ovnkube\\\\nI1125 09:04:54.885116 5883 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 09:04:54.885159 5883 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/l
ib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.409102 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 
25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.421814 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.446586 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.446637 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.446648 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.446660 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.446671 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:55Z","lastTransitionTime":"2025-11-25T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.461487 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.500574 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.541772 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b454e10f5a27e7fcf592c6e895778990ee13d94756f5bd6a0ae252e1b00ade5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fc
adab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:
04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.548049 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.548077 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.548086 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.548097 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.548106 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:55Z","lastTransitionTime":"2025-11-25T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.581068 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.620996 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.649692 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.649718 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.649725 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.649756 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.649765 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:55Z","lastTransitionTime":"2025-11-25T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.661402 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.700060 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.745657 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019be
e1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1
dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.751729 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.751752 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.751760 4565 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.751771 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.751806 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:55Z","lastTransitionTime":"2025-11-25T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.781216 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.820897 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.853468 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.853490 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.853498 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.853508 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.853515 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:55Z","lastTransitionTime":"2025-11-25T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.860795 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.901661 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7e
cc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.941678 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.954834 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.954861 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.954901 4565 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.954914 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.954922 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:55Z","lastTransitionTime":"2025-11-25T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:55 crc kubenswrapper[4565]: I1125 09:04:55.981675 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd
1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:55Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.022037 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b454e10f5a27e7fcf592c6e895
778990ee13d94756f5bd6a0ae252e1b00ade5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{
\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7f
a93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.056760 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.056783 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.056791 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.056803 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.056811 
4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:56Z","lastTransitionTime":"2025-11-25T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.060707 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.100339 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.140792 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.158316 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.158341 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.158352 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.158365 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.158374 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:56Z","lastTransitionTime":"2025-11-25T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.184217 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.221656 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703
f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.259838 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.259870 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.259888 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.259901 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.259911 4565 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:56Z","lastTransitionTime":"2025-11-25T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.265391 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.296661 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vk74d_23e95c48-8d61-4222-a968-b86203ef8aab/ovnkube-controller/1.log" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.299198 4565 scope.go:117] "RemoveContainer" containerID="656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896" Nov 25 09:04:56 crc kubenswrapper[4565]: E1125 09:04:56.299377 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\"" 
pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.300669 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube
-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c205402a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.345293 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa072117cd29736444da6447e68520d4cd38fe1bdcefa94fe4348d942e386371\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"message\\\":\\\"s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 09:04:53.801855 5737 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 09:04:53.801944 5737 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 09:04:53.802100 5737 
reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 09:04:53.802304 5737 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:53.802363 5737 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 09:04:53.802398 5737 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:53.802493 5737 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"message\\\":\\\"4:54.845681 5883 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 09:04:54.845732 5883 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:54.846028 5883 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:54.847559 5883 handler.go:190] Sending 
*v1.Namespace event handler 1 for removal\\\\nI1125 09:04:54.847583 5883 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 09:04:54.848181 5883 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 09:04:54.848211 5883 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 09:04:54.849240 5883 factory.go:656] Stopping watch factory\\\\nI1125 09:04:54.885040 5883 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 09:04:54.885060 5883 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 09:04:54.885094 5883 ovnkube.go:599] Stopped ovnkube\\\\nI1125 09:04:54.885116 5883 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 09:04:54.885159 5883 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/l
ib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.361148 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:56 
crc kubenswrapper[4565]: I1125 09:04:56.361175 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.361185 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.361198 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.361206 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:56Z","lastTransitionTime":"2025-11-25T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.381518 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.421514 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.462187 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7e
cc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.463668 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.463709 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.463721 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.463741 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.463750 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:56Z","lastTransitionTime":"2025-11-25T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.500691 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.541950 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.566157 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.566181 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.566189 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.566198 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.566206 4565 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:56Z","lastTransitionTime":"2025-11-25T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.582826 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:
04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.623567 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b454e10f5a27e7fcf592c6e895778990ee13d94756f5bd6a0ae252e1b00ade5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additiona
l-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8
c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/op
t/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.665852 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.667878 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.667945 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.667957 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.667971 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.667981 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:56Z","lastTransitionTime":"2025-11-25T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.702029 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.740693 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.769499 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.769524 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.769533 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.769547 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.769556 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:56Z","lastTransitionTime":"2025-11-25T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.780751 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.822076 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.860566 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.871785 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.871814 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.871823 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.871835 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.871844 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:56Z","lastTransitionTime":"2025-11-25T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.902693 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe3
77e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.941220 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c20540
2a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.973868 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.973901 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.973910 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:56 crc 
kubenswrapper[4565]: I1125 09:04:56.973923 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.973950 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:56Z","lastTransitionTime":"2025-11-25T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:56 crc kubenswrapper[4565]: I1125 09:04:56.984025 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"message\\\":\\\"4:54.845681 5883 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 09:04:54.845732 5883 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:54.846028 5883 
reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:54.847559 5883 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 09:04:54.847583 5883 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 09:04:54.848181 5883 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 09:04:54.848211 5883 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 09:04:54.849240 5883 factory.go:656] Stopping watch factory\\\\nI1125 09:04:54.885040 5883 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 09:04:54.885060 5883 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 09:04:54.885094 5883 ovnkube.go:599] Stopped ovnkube\\\\nI1125 09:04:54.885116 5883 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 09:04:54.885159 5883 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e94
3b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.020014 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.075804 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.075831 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.075839 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.075920 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.075957 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:57Z","lastTransitionTime":"2025-11-25T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.096956 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.096957 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:04:57 crc kubenswrapper[4565]: E1125 09:04:57.097037 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.097063 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:04:57 crc kubenswrapper[4565]: E1125 09:04:57.097116 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:04:57 crc kubenswrapper[4565]: E1125 09:04:57.097158 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.106193 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshi
ft-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.115151 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.142208 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.177257 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:57 crc 
kubenswrapper[4565]: I1125 09:04:57.177306 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.177317 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.177329 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.177338 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:57Z","lastTransitionTime":"2025-11-25T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.182499 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b454e10f5a27e7fcf592c6e895778990ee13d94756f5bd6a0ae252e1b00ade5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0e3
d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.223673 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.261840 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.278686 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.278721 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.278730 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.278744 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.278753 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:57Z","lastTransitionTime":"2025-11-25T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.302012 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.341317 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.380744 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.380769 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.380778 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.380791 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.380799 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:57Z","lastTransitionTime":"2025-11-25T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.386650 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.421624 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.440876 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl"] Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.441252 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.464168 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.475258 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.482268 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.482312 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.482322 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.482336 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.482345 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:57Z","lastTransitionTime":"2025-11-25T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.495497 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.540229 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be22f021-a051-4111-ba40-782e0c85f8b5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-dd8cl\" (UID: \"be22f021-a051-4111-ba40-782e0c85f8b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.540256 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be22f021-a051-4111-ba40-782e0c85f8b5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-dd8cl\" (UID: \"be22f021-a051-4111-ba40-782e0c85f8b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.540272 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m8pj\" (UniqueName: \"kubernetes.io/projected/be22f021-a051-4111-ba40-782e0c85f8b5-kube-api-access-7m8pj\") pod \"ovnkube-control-plane-749d76644c-dd8cl\" (UID: \"be22f021-a051-4111-ba40-782e0c85f8b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.540337 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be22f021-a051-4111-ba40-782e0c85f8b5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-dd8cl\" (UID: \"be22f021-a051-4111-ba40-782e0c85f8b5\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.540716 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.581610 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.583694 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.583726 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.583737 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.583750 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.583760 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:57Z","lastTransitionTime":"2025-11-25T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.620856 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc
/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c205402a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.641314 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/be22f021-a051-4111-ba40-782e0c85f8b5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-dd8cl\" (UID: \"be22f021-a051-4111-ba40-782e0c85f8b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.641365 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m8pj\" (UniqueName: \"kubernetes.io/projected/be22f021-a051-4111-ba40-782e0c85f8b5-kube-api-access-7m8pj\") pod \"ovnkube-control-plane-749d76644c-dd8cl\" (UID: \"be22f021-a051-4111-ba40-782e0c85f8b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.641401 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be22f021-a051-4111-ba40-782e0c85f8b5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-dd8cl\" (UID: \"be22f021-a051-4111-ba40-782e0c85f8b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.641533 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be22f021-a051-4111-ba40-782e0c85f8b5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-dd8cl\" (UID: \"be22f021-a051-4111-ba40-782e0c85f8b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.641974 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be22f021-a051-4111-ba40-782e0c85f8b5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-dd8cl\" (UID: \"be22f021-a051-4111-ba40-782e0c85f8b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" Nov 25 09:04:57 crc 
kubenswrapper[4565]: I1125 09:04:57.642010 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be22f021-a051-4111-ba40-782e0c85f8b5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-dd8cl\" (UID: \"be22f021-a051-4111-ba40-782e0c85f8b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.645446 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be22f021-a051-4111-ba40-782e0c85f8b5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-dd8cl\" (UID: \"be22f021-a051-4111-ba40-782e0c85f8b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.665696 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"message\\\":\\\"4:54.845681 5883 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1125 09:04:54.845732 5883 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:54.846028 5883 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:54.847559 5883 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 09:04:54.847583 5883 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 09:04:54.848181 5883 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 09:04:54.848211 5883 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 09:04:54.849240 5883 factory.go:656] Stopping watch factory\\\\nI1125 09:04:54.885040 5883 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 09:04:54.885060 5883 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 09:04:54.885094 5883 ovnkube.go:599] Stopped ovnkube\\\\nI1125 09:04:54.885116 5883 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 09:04:54.885159 5883 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e94
3b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.685419 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.685449 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.685457 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.685468 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.685474 4565 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:57Z","lastTransitionTime":"2025-11-25T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.687206 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m8pj\" (UniqueName: \"kubernetes.io/projected/be22f021-a051-4111-ba40-782e0c85f8b5-kube-api-access-7m8pj\") pod \"ovnkube-control-plane-749d76644c-dd8cl\" (UID: \"be22f021-a051-4111-ba40-782e0c85f8b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.721729 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b454e10f5a27e7fcf592c6e895778990ee13d94756f5bd6a0ae252e1b00ade5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0e3
d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.730501 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.730534 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.730543 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.730555 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.730564 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:57Z","lastTransitionTime":"2025-11-25T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:57 crc kubenswrapper[4565]: E1125 09:04:57.739109 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.741710 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.741797 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.741862 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.741954 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.742010 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:57Z","lastTransitionTime":"2025-11-25T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.749800 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" Nov 25 09:04:57 crc kubenswrapper[4565]: E1125 09:04:57.751248 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.753263 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.753286 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.753294 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.753304 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.753313 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:57Z","lastTransitionTime":"2025-11-25T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:57 crc kubenswrapper[4565]: W1125 09:04:57.760656 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe22f021_a051_4111_ba40_782e0c85f8b5.slice/crio-45f8da8b0fe1417124f2e66c2e256f77613fddd339dd376290ba227e9eb5cdd4 WatchSource:0}: Error finding container 45f8da8b0fe1417124f2e66c2e256f77613fddd339dd376290ba227e9eb5cdd4: Status 404 returned error can't find the container with id 45f8da8b0fe1417124f2e66c2e256f77613fddd339dd376290ba227e9eb5cdd4 Nov 25 09:04:57 crc kubenswrapper[4565]: E1125 09:04:57.762875 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.765184 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.765808 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.765836 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.765846 4565 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.765859 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.765868 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:57Z","lastTransitionTime":"2025-11-25T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:57 crc kubenswrapper[4565]: E1125 09:04:57.775544 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.778106 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.778126 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.778136 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.778144 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.778152 4565 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:57Z","lastTransitionTime":"2025-11-25T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:57 crc kubenswrapper[4565]: E1125 09:04:57.788808 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: E1125 09:04:57.788949 4565 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.789961 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.789983 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.789993 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.790005 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.790013 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:57Z","lastTransitionTime":"2025-11-25T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.801534 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.841099 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.879722 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.891804 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.891832 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.891842 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.891854 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.891862 4565 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:57Z","lastTransitionTime":"2025-11-25T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.926506 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.961910 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.994592 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.994624 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:57 crc 
kubenswrapper[4565]: I1125 09:04:57.994632 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.994644 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:57 crc kubenswrapper[4565]: I1125 09:04:57.994654 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:57Z","lastTransitionTime":"2025-11-25T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.006360 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.041822 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.081349 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be22f021-a051-4111-ba40-782e0c85f8b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dd8cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.097154 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.097189 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.097198 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.097212 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.097222 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:58Z","lastTransitionTime":"2025-11-25T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.121004 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe3
77e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.161259 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c20540
2a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.199570 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.199592 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.199600 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:58 crc 
kubenswrapper[4565]: I1125 09:04:58.199613 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.199622 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:58Z","lastTransitionTime":"2025-11-25T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.204326 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"message\\\":\\\"4:54.845681 5883 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 09:04:54.845732 5883 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:54.846028 5883 
reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:54.847559 5883 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 09:04:54.847583 5883 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 09:04:54.848181 5883 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 09:04:54.848211 5883 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 09:04:54.849240 5883 factory.go:656] Stopping watch factory\\\\nI1125 09:04:54.885040 5883 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 09:04:54.885060 5883 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 09:04:54.885094 5883 ovnkube.go:599] Stopped ovnkube\\\\nI1125 09:04:54.885116 5883 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 09:04:54.885159 5883 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e94
3b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.239974 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.282290 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-
25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.301566 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.301595 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.301604 4565 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.301619 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.301628 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:58Z","lastTransitionTime":"2025-11-25T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.303514 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" event={"ID":"be22f021-a051-4111-ba40-782e0c85f8b5","Type":"ContainerStarted","Data":"0802c4e34d0a2da251d035aaf1249d86d594079eaf4d59a2939436fa9dd06d89"} Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.303548 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" event={"ID":"be22f021-a051-4111-ba40-782e0c85f8b5","Type":"ContainerStarted","Data":"114692b6925665d9a848950509e4a25a930fe69265b1db41a70f83a3bf10acdf"} Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.303558 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" event={"ID":"be22f021-a051-4111-ba40-782e0c85f8b5","Type":"ContainerStarted","Data":"45f8da8b0fe1417124f2e66c2e256f77613fddd339dd376290ba227e9eb5cdd4"} Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.323685 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773
257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.362619 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7e
cc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.400516 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.403712 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.403742 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.403750 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.403764 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.403772 4565 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:58Z","lastTransitionTime":"2025-11-25T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.441904 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:
04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.482177 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b454e10f5a27e7fcf592c6e895778990ee13d94756f5bd6a0ae252e1b00ade5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additiona
l-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8
c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/op
t/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.493266 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.493771 4565 scope.go:117] "RemoveContainer" containerID="656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896" Nov 25 09:04:58 crc kubenswrapper[4565]: E1125 09:04:58.493906 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.506118 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.506146 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.506156 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.506168 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.506176 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:58Z","lastTransitionTime":"2025-11-25T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.520883 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.560981 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.601879 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.608324 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.608358 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.608367 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.608380 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.608387 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:58Z","lastTransitionTime":"2025-11-25T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.640074 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.685428 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.712389 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.712415 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.712425 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.712437 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.712447 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:58Z","lastTransitionTime":"2025-11-25T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.723200 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.761784 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.801829 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.814283 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.814324 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.814336 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.814352 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.814361 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:58Z","lastTransitionTime":"2025-11-25T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.842014 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be22f021-a051-4111-ba40-782e0c85f8b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114692b6925665d9a848950509e4a25a930fe69265b1db41a70f83a3bf10acdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0802c4e34d0a2da251d035aaf1249d86d594079eaf4d59a2939436fa9dd06d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dd8cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.859499 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-fzpzk"] Nov 25 09:04:58 crc 
kubenswrapper[4565]: I1125 09:04:58.859948 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:04:58 crc kubenswrapper[4565]: E1125 09:04:58.860006 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.881913 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09
:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.916443 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.916474 
4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.916483 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.916496 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.916505 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:58Z","lastTransitionTime":"2025-11-25T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.921966 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c20540
2a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:58 crc kubenswrapper[4565]: I1125 09:04:58.965777 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"message\\\":\\\"4:54.845681 5883 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 09:04:54.845732 5883 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:54.846028 5883 
reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:54.847559 5883 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 09:04:54.847583 5883 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 09:04:54.848181 5883 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 09:04:54.848211 5883 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 09:04:54.849240 5883 factory.go:656] Stopping watch factory\\\\nI1125 09:04:54.885040 5883 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 09:04:54.885060 5883 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 09:04:54.885094 5883 ovnkube.go:599] Stopped ovnkube\\\\nI1125 09:04:54.885116 5883 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 09:04:54.885159 5883 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e94
3b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:58Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.002358 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7e
cc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.017704 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.017734 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.017759 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.017776 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.017784 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:59Z","lastTransitionTime":"2025-11-25T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.041415 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.051950 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs\") pod \"network-metrics-daemon-fzpzk\" (UID: \"b5b047b2-31c7-45e7-a944-8d9c6de61061\") " pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.051993 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4f64\" (UniqueName: 
\"kubernetes.io/projected/b5b047b2-31c7-45e7-a944-8d9c6de61061-kube-api-access-d4f64\") pod \"network-metrics-daemon-fzpzk\" (UID: \"b5b047b2-31c7-45e7-a944-8d9c6de61061\") " pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.080325 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.096153 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.096166 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.096167 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:04:59 crc kubenswrapper[4565]: E1125 09:04:59.096235 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:04:59 crc kubenswrapper[4565]: E1125 09:04:59.096307 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:04:59 crc kubenswrapper[4565]: E1125 09:04:59.096357 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.119916 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.119966 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.119976 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.119991 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.119999 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:59Z","lastTransitionTime":"2025-11-25T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.121442 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:59Z 
is after 2025-08-24T17:21:41Z" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.153016 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs\") pod \"network-metrics-daemon-fzpzk\" (UID: \"b5b047b2-31c7-45e7-a944-8d9c6de61061\") " pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.153065 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4f64\" (UniqueName: \"kubernetes.io/projected/b5b047b2-31c7-45e7-a944-8d9c6de61061-kube-api-access-d4f64\") pod \"network-metrics-daemon-fzpzk\" (UID: \"b5b047b2-31c7-45e7-a944-8d9c6de61061\") " pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:04:59 crc kubenswrapper[4565]: E1125 09:04:59.153105 4565 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 09:04:59 crc kubenswrapper[4565]: E1125 09:04:59.153155 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs podName:b5b047b2-31c7-45e7-a944-8d9c6de61061 nodeName:}" failed. No retries permitted until 2025-11-25 09:04:59.653141232 +0000 UTC m=+32.855636369 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs") pod "network-metrics-daemon-fzpzk" (UID: "b5b047b2-31c7-45e7-a944-8d9c6de61061") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.161827 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b454e10f5a27e7fcf592c6e895778990ee13d94756f5bd6a0ae252e1b00ade5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714
c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net
.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.191598 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4f64\" (UniqueName: \"kubernetes.io/projected/b5b047b2-31c7-45e7-a944-8d9c6de61061-kube-api-access-d4f64\") pod \"network-metrics-daemon-fzpzk\" (UID: \"b5b047b2-31c7-45e7-a944-8d9c6de61061\") " pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.221477 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.221505 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.221513 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.221534 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.221542 4565 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:59Z","lastTransitionTime":"2025-11-25T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.225616 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.261125 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.300241 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.323101 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.323130 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.323142 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.323158 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.323168 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:59Z","lastTransitionTime":"2025-11-25T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.341395 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.382451 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.419855 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.425142 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.425169 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.425179 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.425191 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.425199 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:59Z","lastTransitionTime":"2025-11-25T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.460975 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe3
77e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.501462 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c20540
2a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.527414 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.527451 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.527461 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:59 crc 
kubenswrapper[4565]: I1125 09:04:59.527475 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.527487 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:59Z","lastTransitionTime":"2025-11-25T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.545550 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"message\\\":\\\"4:54.845681 5883 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 09:04:54.845732 5883 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:54.846028 5883 
reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:54.847559 5883 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 09:04:54.847583 5883 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 09:04:54.848181 5883 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 09:04:54.848211 5883 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 09:04:54.849240 5883 factory.go:656] Stopping watch factory\\\\nI1125 09:04:54.885040 5883 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 09:04:54.885060 5883 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 09:04:54.885094 5883 ovnkube.go:599] Stopped ovnkube\\\\nI1125 09:04:54.885116 5883 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 09:04:54.885159 5883 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e94
3b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.581128 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.620595 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be22f021-a051-4111-ba40-782e0c85f8b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114692b6925665d9a848950509e4a25a930fe69265b1db41a70f83a3bf10acdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0802c4e34d0a2da251d035aaf1249d86d594079eaf4d59a2939436fa9dd06d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dd8cl\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.629833 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.629863 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.629870 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.629881 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.629890 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:59Z","lastTransitionTime":"2025-11-25T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.655249 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs\") pod \"network-metrics-daemon-fzpzk\" (UID: \"b5b047b2-31c7-45e7-a944-8d9c6de61061\") " pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:04:59 crc kubenswrapper[4565]: E1125 09:04:59.655368 4565 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 09:04:59 crc kubenswrapper[4565]: E1125 09:04:59.655408 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs podName:b5b047b2-31c7-45e7-a944-8d9c6de61061 nodeName:}" failed. No retries permitted until 2025-11-25 09:05:00.655398404 +0000 UTC m=+33.857893542 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs") pod "network-metrics-daemon-fzpzk" (UID: "b5b047b2-31c7-45e7-a944-8d9c6de61061") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.660629 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fzpzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b047b2-31c7-45e7-a944-8d9c6de61061\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fzpzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 25 09:04:59 crc 
kubenswrapper[4565]: I1125 09:04:59.731521 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.731552 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.731561 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.731576 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.731585 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:59Z","lastTransitionTime":"2025-11-25T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.833797 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.833884 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.833966 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.834023 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.834137 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:59Z","lastTransitionTime":"2025-11-25T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.936427 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.936453 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.936461 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.936471 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:04:59 crc kubenswrapper[4565]: I1125 09:04:59.936477 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:04:59Z","lastTransitionTime":"2025-11-25T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.038358 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.038386 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.038394 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.038405 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.038413 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:00Z","lastTransitionTime":"2025-11-25T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.140616 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.140715 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.140780 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.140840 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.140916 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:00Z","lastTransitionTime":"2025-11-25T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.242317 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.242343 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.242353 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.242365 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.242373 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:00Z","lastTransitionTime":"2025-11-25T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.343865 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.343909 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.343918 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.343947 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.343955 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:00Z","lastTransitionTime":"2025-11-25T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.445952 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.446041 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.446110 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.446178 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.446240 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:00Z","lastTransitionTime":"2025-11-25T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.548425 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.548536 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.548601 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.548673 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.548731 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:00Z","lastTransitionTime":"2025-11-25T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.649861 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.649887 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.649911 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.649924 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.649952 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:00Z","lastTransitionTime":"2025-11-25T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.661440 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs\") pod \"network-metrics-daemon-fzpzk\" (UID: \"b5b047b2-31c7-45e7-a944-8d9c6de61061\") " pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:00 crc kubenswrapper[4565]: E1125 09:05:00.661628 4565 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 09:05:00 crc kubenswrapper[4565]: E1125 09:05:00.661705 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs podName:b5b047b2-31c7-45e7-a944-8d9c6de61061 nodeName:}" failed. No retries permitted until 2025-11-25 09:05:02.661688375 +0000 UTC m=+35.864183523 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs") pod "network-metrics-daemon-fzpzk" (UID: "b5b047b2-31c7-45e7-a944-8d9c6de61061") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.751741 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.751816 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.751827 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.751842 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.751852 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:00Z","lastTransitionTime":"2025-11-25T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.854021 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.854071 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.854080 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.854095 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.854103 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:00Z","lastTransitionTime":"2025-11-25T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.863366 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:05:00 crc kubenswrapper[4565]: E1125 09:05:00.863478 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 09:05:16.863455274 +0000 UTC m=+50.065950463 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.955615 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.955665 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.955676 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.955685 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.955695 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:00Z","lastTransitionTime":"2025-11-25T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.964051 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.964081 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.964107 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:00 crc kubenswrapper[4565]: I1125 09:05:00.964126 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:00 crc kubenswrapper[4565]: E1125 09:05:00.964157 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Nov 25 09:05:00 crc kubenswrapper[4565]: E1125 09:05:00.964170 4565 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 09:05:00 crc kubenswrapper[4565]: E1125 09:05:00.964186 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 09:05:00 crc kubenswrapper[4565]: E1125 09:05:00.964197 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 09:05:00 crc kubenswrapper[4565]: E1125 09:05:00.964207 4565 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:05:00 crc kubenswrapper[4565]: E1125 09:05:00.964209 4565 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 09:05:00 crc kubenswrapper[4565]: E1125 09:05:00.964220 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 09:05:16.964205398 +0000 UTC m=+50.166700535 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 09:05:00 crc kubenswrapper[4565]: E1125 09:05:00.964178 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 09:05:00 crc kubenswrapper[4565]: E1125 09:05:00.964249 4565 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:05:00 crc kubenswrapper[4565]: E1125 09:05:00.964237 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 09:05:16.96422823 +0000 UTC m=+50.166723368 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:05:00 crc kubenswrapper[4565]: E1125 09:05:00.964301 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 09:05:16.96428058 +0000 UTC m=+50.166775728 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 09:05:00 crc kubenswrapper[4565]: E1125 09:05:00.964318 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 09:05:16.964309013 +0000 UTC m=+50.166804151 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.057436 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.057473 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.057481 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.057492 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.057501 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:01Z","lastTransitionTime":"2025-11-25T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.096328 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.096352 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.096377 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:01 crc kubenswrapper[4565]: E1125 09:05:01.096746 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:01 crc kubenswrapper[4565]: E1125 09:05:01.096624 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:01 crc kubenswrapper[4565]: E1125 09:05:01.096811 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.096460 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:01 crc kubenswrapper[4565]: E1125 09:05:01.096966 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.158879 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.158917 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.158944 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.158957 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.158966 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:01Z","lastTransitionTime":"2025-11-25T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.260554 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.260595 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.260603 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.260612 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.260621 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:01Z","lastTransitionTime":"2025-11-25T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.362305 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.362336 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.362345 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.362358 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.362366 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:01Z","lastTransitionTime":"2025-11-25T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.463945 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.463975 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.463984 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.463994 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.464006 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:01Z","lastTransitionTime":"2025-11-25T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.566134 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.566170 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.566181 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.566196 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.566205 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:01Z","lastTransitionTime":"2025-11-25T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.668292 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.668320 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.668328 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.668337 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.668344 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:01Z","lastTransitionTime":"2025-11-25T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.770050 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.770081 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.770090 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.770103 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.770111 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:01Z","lastTransitionTime":"2025-11-25T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.872435 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.872487 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.872498 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.872514 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.872528 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:01Z","lastTransitionTime":"2025-11-25T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.974459 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.974491 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.974500 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.974512 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:01 crc kubenswrapper[4565]: I1125 09:05:01.974520 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:01Z","lastTransitionTime":"2025-11-25T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.075548 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.075595 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.075605 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.075614 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.075622 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:02Z","lastTransitionTime":"2025-11-25T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.177031 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.177056 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.177065 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.177076 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.177084 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:02Z","lastTransitionTime":"2025-11-25T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.278310 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.278340 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.278348 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.278359 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.278368 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:02Z","lastTransitionTime":"2025-11-25T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.379827 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.380311 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.380388 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.380467 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.380523 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:02Z","lastTransitionTime":"2025-11-25T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.482522 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.482543 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.482552 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.482561 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.482568 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:02Z","lastTransitionTime":"2025-11-25T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.584633 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.584659 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.584666 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.584677 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.584685 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:02Z","lastTransitionTime":"2025-11-25T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.676836 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs\") pod \"network-metrics-daemon-fzpzk\" (UID: \"b5b047b2-31c7-45e7-a944-8d9c6de61061\") " pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:02 crc kubenswrapper[4565]: E1125 09:05:02.676940 4565 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 09:05:02 crc kubenswrapper[4565]: E1125 09:05:02.676981 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs podName:b5b047b2-31c7-45e7-a944-8d9c6de61061 nodeName:}" failed. No retries permitted until 2025-11-25 09:05:06.676967584 +0000 UTC m=+39.879462722 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs") pod "network-metrics-daemon-fzpzk" (UID: "b5b047b2-31c7-45e7-a944-8d9c6de61061") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.686202 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.686237 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.686246 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.686259 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.686267 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:02Z","lastTransitionTime":"2025-11-25T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.787653 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.787683 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.787693 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.787705 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.787713 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:02Z","lastTransitionTime":"2025-11-25T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.889113 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.889432 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.889499 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.889558 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.889615 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:02Z","lastTransitionTime":"2025-11-25T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.991321 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.991393 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.991409 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.991426 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:02 crc kubenswrapper[4565]: I1125 09:05:02.991467 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:02Z","lastTransitionTime":"2025-11-25T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.092819 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.092853 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.092862 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.092874 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.092884 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:03Z","lastTransitionTime":"2025-11-25T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.097232 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.097257 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.097261 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:03 crc kubenswrapper[4565]: E1125 09:05:03.097326 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:03 crc kubenswrapper[4565]: E1125 09:05:03.097391 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:03 crc kubenswrapper[4565]: E1125 09:05:03.097474 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.097604 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:03 crc kubenswrapper[4565]: E1125 09:05:03.097769 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.194885 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.194924 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.194951 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.194962 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.194969 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:03Z","lastTransitionTime":"2025-11-25T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.296717 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.296745 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.296755 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.296768 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.296776 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:03Z","lastTransitionTime":"2025-11-25T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.398162 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.398195 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.398219 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.398231 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.398239 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:03Z","lastTransitionTime":"2025-11-25T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.499795 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.499822 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.499831 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.499840 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.499848 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:03Z","lastTransitionTime":"2025-11-25T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.601530 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.601571 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.601582 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.601596 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.601606 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:03Z","lastTransitionTime":"2025-11-25T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.703593 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.703617 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.703625 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.703634 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.703641 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:03Z","lastTransitionTime":"2025-11-25T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.805216 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.805378 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.805539 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.805554 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.805564 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:03Z","lastTransitionTime":"2025-11-25T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.907183 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.907232 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.907241 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.907251 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:03 crc kubenswrapper[4565]: I1125 09:05:03.907258 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:03Z","lastTransitionTime":"2025-11-25T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.010080 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.010111 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.010120 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.010133 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.010142 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:04Z","lastTransitionTime":"2025-11-25T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.111940 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.111978 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.111987 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.111997 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.112004 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:04Z","lastTransitionTime":"2025-11-25T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.214104 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.214149 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.214159 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.214170 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.214179 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:04Z","lastTransitionTime":"2025-11-25T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.316217 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.316244 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.316256 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.316265 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.316273 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:04Z","lastTransitionTime":"2025-11-25T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.418097 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.418121 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.418130 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.418139 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.418147 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:04Z","lastTransitionTime":"2025-11-25T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.519970 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.520005 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.520068 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.520080 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.520091 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:04Z","lastTransitionTime":"2025-11-25T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.621837 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.621863 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.621871 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.621880 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.621887 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:04Z","lastTransitionTime":"2025-11-25T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.723395 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.723417 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.723426 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.723451 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.723460 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:04Z","lastTransitionTime":"2025-11-25T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.825256 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.825294 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.825303 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.825318 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.825329 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:04Z","lastTransitionTime":"2025-11-25T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.926563 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.926589 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.926597 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.926607 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:04 crc kubenswrapper[4565]: I1125 09:05:04.926615 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:04Z","lastTransitionTime":"2025-11-25T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.028002 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.028029 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.028038 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.028048 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.028055 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:05Z","lastTransitionTime":"2025-11-25T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.097014 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.097051 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:05 crc kubenswrapper[4565]: E1125 09:05:05.097092 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:05 crc kubenswrapper[4565]: E1125 09:05:05.097139 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.097060 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.097171 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:05 crc kubenswrapper[4565]: E1125 09:05:05.097214 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:05 crc kubenswrapper[4565]: E1125 09:05:05.097277 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.129456 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.129515 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.129525 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.129536 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.129544 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:05Z","lastTransitionTime":"2025-11-25T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.231084 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.231110 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.231118 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.231129 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.231137 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:05Z","lastTransitionTime":"2025-11-25T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.333143 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.333167 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.333175 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.333186 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.333195 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:05Z","lastTransitionTime":"2025-11-25T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.434650 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.434704 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.434714 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.434727 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.434735 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:05Z","lastTransitionTime":"2025-11-25T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.536883 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.536926 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.536952 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.536964 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.536972 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:05Z","lastTransitionTime":"2025-11-25T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.638750 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.638772 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.638781 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.638789 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.638797 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:05Z","lastTransitionTime":"2025-11-25T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.740840 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.740870 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.740878 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.740890 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.740897 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:05Z","lastTransitionTime":"2025-11-25T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.842648 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.842675 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.842684 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.842693 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.842701 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:05Z","lastTransitionTime":"2025-11-25T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.944458 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.944483 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.944492 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.944500 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:05 crc kubenswrapper[4565]: I1125 09:05:05.944508 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:05Z","lastTransitionTime":"2025-11-25T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.045958 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.045991 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.046000 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.046012 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.046020 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:06Z","lastTransitionTime":"2025-11-25T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.147612 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.147648 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.147658 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.147674 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.147684 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:06Z","lastTransitionTime":"2025-11-25T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.249629 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.249663 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.249671 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.249683 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.249692 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:06Z","lastTransitionTime":"2025-11-25T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.351587 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.351640 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.351650 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.351662 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.351671 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:06Z","lastTransitionTime":"2025-11-25T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.453525 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.453560 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.453569 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.453580 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.453589 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:06Z","lastTransitionTime":"2025-11-25T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.555565 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.555593 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.555603 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.555647 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.555658 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:06Z","lastTransitionTime":"2025-11-25T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.657709 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.657755 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.657765 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.657785 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.657793 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:06Z","lastTransitionTime":"2025-11-25T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.711550 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs\") pod \"network-metrics-daemon-fzpzk\" (UID: \"b5b047b2-31c7-45e7-a944-8d9c6de61061\") " pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:06 crc kubenswrapper[4565]: E1125 09:05:06.711661 4565 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 09:05:06 crc kubenswrapper[4565]: E1125 09:05:06.711704 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs podName:b5b047b2-31c7-45e7-a944-8d9c6de61061 nodeName:}" failed. No retries permitted until 2025-11-25 09:05:14.711693444 +0000 UTC m=+47.914188582 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs") pod "network-metrics-daemon-fzpzk" (UID: "b5b047b2-31c7-45e7-a944-8d9c6de61061") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.759706 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.759735 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.759744 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.759755 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.759763 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:06Z","lastTransitionTime":"2025-11-25T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.861553 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.861585 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.861595 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.861608 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.861615 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:06Z","lastTransitionTime":"2025-11-25T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.963374 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.963403 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.963411 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.963422 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:06 crc kubenswrapper[4565]: I1125 09:05:06.963429 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:06Z","lastTransitionTime":"2025-11-25T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.065379 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.065408 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.065417 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.065430 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.065442 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:07Z","lastTransitionTime":"2025-11-25T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.097023 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.097059 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.097199 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.097304 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:07 crc kubenswrapper[4565]: E1125 09:05:07.097420 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:07 crc kubenswrapper[4565]: E1125 09:05:07.097562 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:07 crc kubenswrapper[4565]: E1125 09:05:07.097679 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:07 crc kubenswrapper[4565]: E1125 09:05:07.097718 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.106425 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.114600 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.124293 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b454e10f5a27e7fcf592c6e895778990ee13d94756f5bd6a0ae252e1b00ade5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fc
adab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:
04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.132884 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.140152 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.153180 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.161450 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.166742 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.166773 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.166783 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.166795 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.166804 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:07Z","lastTransitionTime":"2025-11-25T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.168434 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.181041 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.190686 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c20540
2a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.202597 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"message\\\":\\\"4:54.845681 5883 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 09:04:54.845732 5883 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:54.846028 5883 
reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:54.847559 5883 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 09:04:54.847583 5883 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 09:04:54.848181 5883 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 09:04:54.848211 5883 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 09:04:54.849240 5883 factory.go:656] Stopping watch factory\\\\nI1125 09:04:54.885040 5883 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 09:04:54.885060 5883 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 09:04:54.885094 5883 ovnkube.go:599] Stopped ovnkube\\\\nI1125 09:04:54.885116 5883 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 09:04:54.885159 5883 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e94
3b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.209035 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.215709 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be22f021-a051-4111-ba40-782e0c85f8b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114692b6925665d9a848950509e4a25a930fe69265b1db41a70f83a3bf10acdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0802c4e34d0a2da251d035aaf1249d86d594079eaf4d59a2939436fa9dd06d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dd8cl\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.222558 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fzpzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b047b2-31c7-45e7-a944-8d9c6de61061\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fzpzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:07 crc 
kubenswrapper[4565]: I1125 09:05:07.230779 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.238671 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.248293 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7e
cc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.268001 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.268024 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.268032 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.268043 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.268053 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:07Z","lastTransitionTime":"2025-11-25T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.370148 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.370181 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.370191 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.370203 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.370212 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:07Z","lastTransitionTime":"2025-11-25T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.471803 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.471837 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.471847 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.471860 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.471871 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:07Z","lastTransitionTime":"2025-11-25T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.573994 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.574028 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.574052 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.574066 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.574075 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:07Z","lastTransitionTime":"2025-11-25T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.675948 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.675976 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.675984 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.675996 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.676004 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:07Z","lastTransitionTime":"2025-11-25T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.777673 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.777704 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.777713 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.777724 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.777732 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:07Z","lastTransitionTime":"2025-11-25T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.879237 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.879265 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.879273 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.879284 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.879292 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:07Z","lastTransitionTime":"2025-11-25T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.980954 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.980985 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.980993 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.981006 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:07 crc kubenswrapper[4565]: I1125 09:05:07.981014 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:07Z","lastTransitionTime":"2025-11-25T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.082964 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.082996 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.083008 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.083018 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.083024 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:08Z","lastTransitionTime":"2025-11-25T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.094840 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.094871 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.094881 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.094894 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.094902 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:08Z","lastTransitionTime":"2025-11-25T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:08 crc kubenswrapper[4565]: E1125 09:05:08.104168 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:08Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.106544 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.106597 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.106607 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.106618 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.106626 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:08Z","lastTransitionTime":"2025-11-25T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:08Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.129489 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.129519 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.129527 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.129537 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.129544 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:08Z","lastTransitionTime":"2025-11-25T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:08 crc kubenswrapper[4565]: E1125 09:05:08.137490 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:08Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.139486 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.139540 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.139549 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.139559 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.139567 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:08Z","lastTransitionTime":"2025-11-25T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:08 crc kubenswrapper[4565]: E1125 09:05:08.147429 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:08Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:08 crc kubenswrapper[4565]: E1125 09:05:08.147598 4565 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.184923 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.184974 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.184982 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.184994 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.185003 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:08Z","lastTransitionTime":"2025-11-25T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.287297 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.287324 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.287332 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.287344 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.287352 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:08Z","lastTransitionTime":"2025-11-25T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.389391 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.389416 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.389425 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.389436 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.389444 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:08Z","lastTransitionTime":"2025-11-25T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.491271 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.491306 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.491315 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.491327 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.491336 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:08Z","lastTransitionTime":"2025-11-25T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.593043 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.593109 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.593117 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.593129 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.593137 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:08Z","lastTransitionTime":"2025-11-25T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.695557 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.695591 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.695601 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.695631 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.695650 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:08Z","lastTransitionTime":"2025-11-25T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.797753 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.797800 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.797811 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.797820 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.797827 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:08Z","lastTransitionTime":"2025-11-25T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.899300 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.899331 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.899339 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.899351 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:08 crc kubenswrapper[4565]: I1125 09:05:08.899359 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:08Z","lastTransitionTime":"2025-11-25T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.001516 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.001551 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.001561 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.001574 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.001583 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:09Z","lastTransitionTime":"2025-11-25T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.096538 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.096598 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.096601 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.096607 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:09 crc kubenswrapper[4565]: E1125 09:05:09.096818 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:09 crc kubenswrapper[4565]: E1125 09:05:09.097151 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:09 crc kubenswrapper[4565]: E1125 09:05:09.097222 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.097309 4565 scope.go:117] "RemoveContainer" containerID="656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896" Nov 25 09:05:09 crc kubenswrapper[4565]: E1125 09:05:09.097507 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.103245 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.103332 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.103343 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.103356 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.103365 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:09Z","lastTransitionTime":"2025-11-25T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.205463 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.205492 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.205501 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.205514 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.205531 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:09Z","lastTransitionTime":"2025-11-25T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.307595 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.307623 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.307632 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.307645 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.307654 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:09Z","lastTransitionTime":"2025-11-25T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.327767 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vk74d_23e95c48-8d61-4222-a968-b86203ef8aab/ovnkube-controller/1.log" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.329598 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerStarted","Data":"c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8"} Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.329999 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.351596 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"message\\\":\\\"4:54.845681 5883 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 09:04:54.845732 5883 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 
09:04:54.846028 5883 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:54.847559 5883 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 09:04:54.847583 5883 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 09:04:54.848181 5883 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 09:04:54.848211 5883 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 09:04:54.849240 5883 factory.go:656] Stopping watch factory\\\\nI1125 09:04:54.885040 5883 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 09:04:54.885060 5883 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 09:04:54.885094 5883 ovnkube.go:599] Stopped ovnkube\\\\nI1125 09:04:54.885116 5883 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 09:04:54.885159 5883 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.363323 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.371392 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be22f021-a051-4111-ba40-782e0c85f8b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114692b6925665d9a848950509e4a25a930fe69265b1db41a70f83a3bf10acdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0802c4e34d0a2da251d035aaf1249d86d594079eaf4d59a2939436fa9dd06d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dd8cl\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.380183 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fzpzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b047b2-31c7-45e7-a944-8d9c6de61061\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fzpzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:09 crc 
kubenswrapper[4565]: I1125 09:05:09.388153 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.395587 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c20540
2a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.404234 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7e
cc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.409864 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.409895 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.409904 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.409918 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.409952 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:09Z","lastTransitionTime":"2025-11-25T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.412263 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.419717 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.428450 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.444867 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b454e10f5a27e7fcf592c6e895778990ee13d94756f5bd6a0ae252e1b00ade5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fc
adab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:
04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.458020 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.466078 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.474448 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.481417 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.493442 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019be
e1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1
dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.502120 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.511159 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.511188 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.511197 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.511237 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.511246 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:09Z","lastTransitionTime":"2025-11-25T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.613107 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.613135 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.613142 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.613154 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.613161 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:09Z","lastTransitionTime":"2025-11-25T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.715392 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.715428 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.715437 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.715463 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.715472 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:09Z","lastTransitionTime":"2025-11-25T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.816705 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.816734 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.816743 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.816756 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.816764 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:09Z","lastTransitionTime":"2025-11-25T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.918139 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.918165 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.918173 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.918184 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:09 crc kubenswrapper[4565]: I1125 09:05:09.918193 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:09Z","lastTransitionTime":"2025-11-25T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.020025 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.020050 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.020059 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.020069 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.020077 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:10Z","lastTransitionTime":"2025-11-25T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.122078 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.122114 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.122122 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.122134 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.122142 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:10Z","lastTransitionTime":"2025-11-25T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.223751 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.223790 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.223800 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.223814 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.223825 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:10Z","lastTransitionTime":"2025-11-25T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.325712 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.325755 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.325765 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.325787 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.325797 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:10Z","lastTransitionTime":"2025-11-25T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.333037 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vk74d_23e95c48-8d61-4222-a968-b86203ef8aab/ovnkube-controller/2.log" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.333489 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vk74d_23e95c48-8d61-4222-a968-b86203ef8aab/ovnkube-controller/1.log" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.335443 4565 generic.go:334] "Generic (PLEG): container finished" podID="23e95c48-8d61-4222-a968-b86203ef8aab" containerID="c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8" exitCode=1 Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.335473 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerDied","Data":"c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8"} Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.335500 4565 scope.go:117] "RemoveContainer" containerID="656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.335985 4565 scope.go:117] "RemoveContainer" containerID="c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8" Nov 25 09:05:10 crc kubenswrapper[4565]: E1125 09:05:10.336108 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.349031 4565 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb
76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.357263 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.364711 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.372813 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.382748 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b454e10f5a27e7fcf592c6e895778990ee13d94756f5bd6a0ae252e1b00ade5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fc
adab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:
04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.389677 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.405703 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.414288 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.421427 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.428146 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.428170 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.428179 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.428192 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.428200 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:10Z","lastTransitionTime":"2025-11-25T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.429816 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.438055 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.445323 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fzpzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b047b2-31c7-45e7-a944-8d9c6de61061\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fzpzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:10 crc 
kubenswrapper[4565]: I1125 09:05:10.453198 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.459828 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c20540
2a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.471398 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://656c9ba095910816cc6c54327731bd1cff6228d124fe1bd8cb959c582a016896\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"message\\\":\\\"4:54.845681 5883 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 09:04:54.845732 5883 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:54.846028 5883 
reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 09:04:54.847559 5883 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 09:04:54.847583 5883 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 09:04:54.848181 5883 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 09:04:54.848211 5883 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 09:04:54.849240 5883 factory.go:656] Stopping watch factory\\\\nI1125 09:04:54.885040 5883 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 09:04:54.885060 5883 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 09:04:54.885094 5883 ovnkube.go:599] Stopped ovnkube\\\\nI1125 09:04:54.885116 5883 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 09:04:54.885159 5883 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:05:09Z\\\",\\\"message\\\":\\\"e crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 2025-08-24T17:21:41Z]\\\\nI1125 09:05:09.705848 6115 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} 
options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"
},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:10Z is after 2025-08-24T17:21:41Z" 
Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.477565 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.484736 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be22f021-a051-4111-ba40-782e0c85f8b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114692b6925665d9a848950509e4a25a930fe69265b1db
41a70f83a3bf10acdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0802c4e34d0a2da251d035aaf1249d86d594079eaf4d59a2939436fa9dd06d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dd8cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.530171 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.530214 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.530223 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.530234 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.530242 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:10Z","lastTransitionTime":"2025-11-25T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.631909 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.631965 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.631974 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.631986 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.631995 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:10Z","lastTransitionTime":"2025-11-25T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.733727 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.733755 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.733764 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.733774 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.733782 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:10Z","lastTransitionTime":"2025-11-25T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.835411 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.835433 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.835440 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.835450 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.835458 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:10Z","lastTransitionTime":"2025-11-25T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.936649 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.936674 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.936699 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.936709 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:10 crc kubenswrapper[4565]: I1125 09:05:10.936716 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:10Z","lastTransitionTime":"2025-11-25T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.038651 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.038725 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.038737 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.038753 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.038763 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:11Z","lastTransitionTime":"2025-11-25T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.096392 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.096491 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.096461 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.096427 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:11 crc kubenswrapper[4565]: E1125 09:05:11.096709 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:11 crc kubenswrapper[4565]: E1125 09:05:11.096787 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:11 crc kubenswrapper[4565]: E1125 09:05:11.096962 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:11 crc kubenswrapper[4565]: E1125 09:05:11.097030 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.140592 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.140616 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.140624 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.140633 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.140641 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:11Z","lastTransitionTime":"2025-11-25T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.242687 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.242717 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.242727 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.242739 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.242749 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:11Z","lastTransitionTime":"2025-11-25T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.338278 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vk74d_23e95c48-8d61-4222-a968-b86203ef8aab/ovnkube-controller/2.log" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.340732 4565 scope.go:117] "RemoveContainer" containerID="c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8" Nov 25 09:05:11 crc kubenswrapper[4565]: E1125 09:05:11.340846 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.344340 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.344426 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.344488 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.344542 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.344607 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:11Z","lastTransitionTime":"2025-11-25T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.350513 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-c
luster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.357573 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c20540
2a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.369100 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:05:09Z\\\",\\\"message\\\":\\\"e crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 
2025-08-24T17:21:41Z]\\\\nI1125 09:05:09.705848 6115 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:05:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e94
3b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.375324 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.382658 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be22f021-a051-4111-ba40-782e0c85f8b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114692b6925665d9a848950509e4a25a930fe69265b1db41a70f83a3bf10acdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0802c4e34d0a2da251d035aaf1249d86d594079eaf4d59a2939436fa9dd06d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dd8cl\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.389500 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fzpzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b047b2-31c7-45e7-a944-8d9c6de61061\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fzpzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:11 crc 
kubenswrapper[4565]: I1125 09:05:11.398010 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00
b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.406566 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.417983 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.426467 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.435754 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b454e10f5a27e7fcf592c6e895778990ee13d94756f5bd6a0ae252e1b00ade5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fc
adab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:
04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.446620 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.446653 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.446662 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.446673 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.446681 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:11Z","lastTransitionTime":"2025-11-25T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.450478 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.459273 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.466500 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.475097 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.483029 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.489425 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.548520 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.548546 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.548570 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.548584 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.548592 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:11Z","lastTransitionTime":"2025-11-25T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.649817 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.649862 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.649873 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.649889 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.649900 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:11Z","lastTransitionTime":"2025-11-25T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.751556 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.751593 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.751602 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.751616 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.751627 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:11Z","lastTransitionTime":"2025-11-25T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.853109 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.853159 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.853169 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.853181 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.853190 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:11Z","lastTransitionTime":"2025-11-25T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.954899 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.954959 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.954969 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.954985 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:11 crc kubenswrapper[4565]: I1125 09:05:11.954995 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:11Z","lastTransitionTime":"2025-11-25T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.056380 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.056414 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.056422 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.056437 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.056447 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:12Z","lastTransitionTime":"2025-11-25T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.158155 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.158189 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.158198 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.158214 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.158223 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:12Z","lastTransitionTime":"2025-11-25T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.260262 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.260289 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.260313 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.260325 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.260332 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:12Z","lastTransitionTime":"2025-11-25T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.361601 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.361635 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.361644 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.361658 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.361668 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:12Z","lastTransitionTime":"2025-11-25T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.463518 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.463547 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.463575 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.463585 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.463592 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:12Z","lastTransitionTime":"2025-11-25T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.565536 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.565566 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.565574 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.565586 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.565595 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:12Z","lastTransitionTime":"2025-11-25T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.666778 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.666918 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.667029 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.667089 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.667146 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:12Z","lastTransitionTime":"2025-11-25T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.769088 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.769210 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.769274 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.769330 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.769384 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:12Z","lastTransitionTime":"2025-11-25T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.871052 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.871117 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.871128 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.871141 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.871150 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:12Z","lastTransitionTime":"2025-11-25T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.972685 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.972716 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.972725 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.972736 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:12 crc kubenswrapper[4565]: I1125 09:05:12.972744 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:12Z","lastTransitionTime":"2025-11-25T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.074204 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.074233 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.074243 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.074258 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.074266 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:13Z","lastTransitionTime":"2025-11-25T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.096853 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.096907 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:13 crc kubenswrapper[4565]: E1125 09:05:13.096956 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.096979 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:13 crc kubenswrapper[4565]: E1125 09:05:13.097063 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.097091 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:13 crc kubenswrapper[4565]: E1125 09:05:13.097144 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:13 crc kubenswrapper[4565]: E1125 09:05:13.097204 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.175965 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.175992 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.176002 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.176013 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.176021 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:13Z","lastTransitionTime":"2025-11-25T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.277900 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.277970 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.277981 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.277996 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.278006 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:13Z","lastTransitionTime":"2025-11-25T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.379455 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.379490 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.379498 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.379511 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.379519 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:13Z","lastTransitionTime":"2025-11-25T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.481359 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.481415 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.481423 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.481436 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.481461 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:13Z","lastTransitionTime":"2025-11-25T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.583562 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.583594 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.583603 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.583615 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.583624 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:13Z","lastTransitionTime":"2025-11-25T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.685859 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.685896 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.685904 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.685917 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.685940 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:13Z","lastTransitionTime":"2025-11-25T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.788035 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.788077 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.788087 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.788101 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.788110 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:13Z","lastTransitionTime":"2025-11-25T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.889860 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.889882 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.889890 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.889900 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.889908 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:13Z","lastTransitionTime":"2025-11-25T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.991414 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.991452 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.991461 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.991471 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:13 crc kubenswrapper[4565]: I1125 09:05:13.991478 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:13Z","lastTransitionTime":"2025-11-25T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.092686 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.092711 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.092720 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.092732 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.092740 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:14Z","lastTransitionTime":"2025-11-25T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.194137 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.194174 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.194184 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.194214 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.194225 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:14Z","lastTransitionTime":"2025-11-25T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.296062 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.296084 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.296094 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.296105 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.296112 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:14Z","lastTransitionTime":"2025-11-25T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.398030 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.398056 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.398065 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.398078 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.398086 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:14Z","lastTransitionTime":"2025-11-25T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.499236 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.499265 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.499273 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.499284 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.499293 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:14Z","lastTransitionTime":"2025-11-25T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.601375 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.601409 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.601419 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.601433 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.601441 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:14Z","lastTransitionTime":"2025-11-25T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.702923 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.702977 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.702987 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.702999 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.703007 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:14Z","lastTransitionTime":"2025-11-25T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.773767 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs\") pod \"network-metrics-daemon-fzpzk\" (UID: \"b5b047b2-31c7-45e7-a944-8d9c6de61061\") " pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:14 crc kubenswrapper[4565]: E1125 09:05:14.773864 4565 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 09:05:14 crc kubenswrapper[4565]: E1125 09:05:14.773924 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs podName:b5b047b2-31c7-45e7-a944-8d9c6de61061 nodeName:}" failed. No retries permitted until 2025-11-25 09:05:30.773910059 +0000 UTC m=+63.976405197 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs") pod "network-metrics-daemon-fzpzk" (UID: "b5b047b2-31c7-45e7-a944-8d9c6de61061") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.804706 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.804725 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.804733 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.804745 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.804752 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:14Z","lastTransitionTime":"2025-11-25T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.906508 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.906539 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.906549 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.906560 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:14 crc kubenswrapper[4565]: I1125 09:05:14.906571 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:14Z","lastTransitionTime":"2025-11-25T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.008489 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.008527 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.008537 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.008559 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.008568 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:15Z","lastTransitionTime":"2025-11-25T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.096857 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.096947 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:15 crc kubenswrapper[4565]: E1125 09:05:15.097032 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.097250 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.097277 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:15 crc kubenswrapper[4565]: E1125 09:05:15.097325 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:15 crc kubenswrapper[4565]: E1125 09:05:15.097370 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:15 crc kubenswrapper[4565]: E1125 09:05:15.097414 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.110397 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.110422 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.110431 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.110442 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.110451 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:15Z","lastTransitionTime":"2025-11-25T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.214361 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.214415 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.214427 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.214448 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.214459 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:15Z","lastTransitionTime":"2025-11-25T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.316119 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.316144 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.316152 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.316163 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.316172 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:15Z","lastTransitionTime":"2025-11-25T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.417534 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.417562 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.417572 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.417583 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.417590 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:15Z","lastTransitionTime":"2025-11-25T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.519147 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.519289 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.519361 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.519423 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.519483 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:15Z","lastTransitionTime":"2025-11-25T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.620780 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.620808 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.620817 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.620828 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.620835 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:15Z","lastTransitionTime":"2025-11-25T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.722339 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.722365 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.722375 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.722385 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.722393 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:15Z","lastTransitionTime":"2025-11-25T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.824316 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.824422 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.824495 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.824559 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.824617 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:15Z","lastTransitionTime":"2025-11-25T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.925880 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.925910 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.925920 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.925947 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:15 crc kubenswrapper[4565]: I1125 09:05:15.925963 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:15Z","lastTransitionTime":"2025-11-25T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.027351 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.027375 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.027383 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.027392 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.027401 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:16Z","lastTransitionTime":"2025-11-25T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.129003 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.129028 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.129036 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.129047 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.129054 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:16Z","lastTransitionTime":"2025-11-25T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.231033 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.231065 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.231077 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.231091 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.231101 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:16Z","lastTransitionTime":"2025-11-25T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.332657 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.332755 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.332812 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.332888 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.332985 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:16Z","lastTransitionTime":"2025-11-25T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.434605 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.434630 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.434637 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.434648 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.434657 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:16Z","lastTransitionTime":"2025-11-25T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.535571 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.535625 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.535635 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.535647 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.535656 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:16Z","lastTransitionTime":"2025-11-25T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.636971 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.637004 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.637013 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.637026 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.637034 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:16Z","lastTransitionTime":"2025-11-25T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.738655 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.738687 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.738696 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.738708 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.738717 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:16Z","lastTransitionTime":"2025-11-25T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.840331 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.840352 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.840360 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.840371 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.840378 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:16Z","lastTransitionTime":"2025-11-25T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.889006 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:05:16 crc kubenswrapper[4565]: E1125 09:05:16.889164 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 09:05:48.889151143 +0000 UTC m=+82.091646271 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.942003 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.942034 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.942046 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.942057 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.942065 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:16Z","lastTransitionTime":"2025-11-25T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.989885 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.989946 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.989988 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:16 crc kubenswrapper[4565]: I1125 09:05:16.990007 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:16 crc kubenswrapper[4565]: E1125 09:05:16.990064 4565 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Nov 25 09:05:16 crc kubenswrapper[4565]: E1125 09:05:16.990082 4565 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 09:05:16 crc kubenswrapper[4565]: E1125 09:05:16.990110 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 09:05:48.990096184 +0000 UTC m=+82.192591322 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 09:05:16 crc kubenswrapper[4565]: E1125 09:05:16.990128 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 09:05:48.990119738 +0000 UTC m=+82.192614875 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 09:05:16 crc kubenswrapper[4565]: E1125 09:05:16.990151 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 09:05:16 crc kubenswrapper[4565]: E1125 09:05:16.990161 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 09:05:16 crc kubenswrapper[4565]: E1125 09:05:16.990171 4565 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:05:16 crc kubenswrapper[4565]: E1125 09:05:16.990197 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 09:05:48.990186584 +0000 UTC m=+82.192681722 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:05:16 crc kubenswrapper[4565]: E1125 09:05:16.990232 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 09:05:16 crc kubenswrapper[4565]: E1125 09:05:16.990242 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 09:05:16 crc kubenswrapper[4565]: E1125 09:05:16.990248 4565 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:05:16 crc kubenswrapper[4565]: E1125 09:05:16.990265 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 09:05:48.990259803 +0000 UTC m=+82.192754940 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.043390 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.043416 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.043424 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.043436 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.043446 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:17Z","lastTransitionTime":"2025-11-25T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.096427 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.096455 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.096473 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.096492 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:17 crc kubenswrapper[4565]: E1125 09:05:17.096580 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:17 crc kubenswrapper[4565]: E1125 09:05:17.096649 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:17 crc kubenswrapper[4565]: E1125 09:05:17.096812 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:17 crc kubenswrapper[4565]: E1125 09:05:17.096862 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.107526 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:17Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.116133 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b454e10f5a27e7fcf592c6e895778990ee13d94756f5bd6a0ae252e1b00ade5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:17Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.123237 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:17Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.130312 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:17Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.137893 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:17Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.147456 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.147631 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.147647 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.147658 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.147667 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:17Z","lastTransitionTime":"2025-11-25T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.150477 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:17Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.163956 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:17Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.171424 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:17Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.178204 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:05:17Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.184247 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:17Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.190752 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be22f021-a051-4111-ba40-782e0c85f8b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114692b6925665d9a848950509e4a25a930fe69265b1db41a70f83a3bf10acdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0802c4e34d0a2da251d035aaf1249d86d5940
79eaf4d59a2939436fa9dd06d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dd8cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:17Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.196982 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fzpzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b047b2-31c7-45e7-a944-8d9c6de61061\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fzpzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:17Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:17 crc 
kubenswrapper[4565]: I1125 09:05:17.204123 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:17Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.210774 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c20540
2a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:17Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.222619 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:05:09Z\\\",\\\"message\\\":\\\"e crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 
2025-08-24T17:21:41Z]\\\\nI1125 09:05:09.705848 6115 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:05:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e94
3b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:17Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.230647 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7e
cc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:17Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.238706 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:17Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.248783 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.248806 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.248815 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.248825 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.248857 4565 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:17Z","lastTransitionTime":"2025-11-25T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.350223 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.350254 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.350263 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.350277 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.350285 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:17Z","lastTransitionTime":"2025-11-25T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.451522 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.451553 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.451561 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.451574 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.451584 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:17Z","lastTransitionTime":"2025-11-25T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.553251 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.553277 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.553286 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.553296 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.553304 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:17Z","lastTransitionTime":"2025-11-25T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.654585 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.654613 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.654620 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.654631 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.654641 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:17Z","lastTransitionTime":"2025-11-25T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.755800 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.755829 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.755836 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.755846 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.755853 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:17Z","lastTransitionTime":"2025-11-25T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.857784 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.857813 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.857821 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.857831 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.857840 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:17Z","lastTransitionTime":"2025-11-25T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.959233 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.959259 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.959268 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.959278 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:17 crc kubenswrapper[4565]: I1125 09:05:17.959284 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:17Z","lastTransitionTime":"2025-11-25T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.060372 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.060404 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.060412 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.060424 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.060432 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:18Z","lastTransitionTime":"2025-11-25T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.161787 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.161828 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.161839 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.161853 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.161863 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:18Z","lastTransitionTime":"2025-11-25T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.262880 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.262913 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.262921 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.262980 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.262989 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:18Z","lastTransitionTime":"2025-11-25T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.349281 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.349315 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.349325 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.349336 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.349345 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:18Z","lastTransitionTime":"2025-11-25T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:18 crc kubenswrapper[4565]: E1125 09:05:18.357484 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:18Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.359420 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.359446 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.359455 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.359465 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.359473 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:18Z","lastTransitionTime":"2025-11-25T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:18 crc kubenswrapper[4565]: E1125 09:05:18.366995 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:18Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.368952 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.368982 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.368991 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.369000 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.369007 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:18Z","lastTransitionTime":"2025-11-25T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:18 crc kubenswrapper[4565]: E1125 09:05:18.376293 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:18Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.378075 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.378094 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.378102 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.378111 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.378117 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:18Z","lastTransitionTime":"2025-11-25T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:18 crc kubenswrapper[4565]: E1125 09:05:18.385540 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:18Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.387315 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.387339 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.387347 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.387356 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.387364 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:18Z","lastTransitionTime":"2025-11-25T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:18 crc kubenswrapper[4565]: E1125 09:05:18.394498 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:18Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:18 crc kubenswrapper[4565]: E1125 09:05:18.394598 4565 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.395286 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.395307 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.395314 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.395324 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.395331 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:18Z","lastTransitionTime":"2025-11-25T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.496945 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.496984 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.496993 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.497002 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.497009 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:18Z","lastTransitionTime":"2025-11-25T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.598748 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.598776 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.598784 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.598796 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.598804 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:18Z","lastTransitionTime":"2025-11-25T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.700211 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.700238 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.700247 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.700258 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.700265 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:18Z","lastTransitionTime":"2025-11-25T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.802504 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.802598 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.802682 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.802740 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.802789 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:18Z","lastTransitionTime":"2025-11-25T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.904230 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.904270 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.904281 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.904297 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:18 crc kubenswrapper[4565]: I1125 09:05:18.904307 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:18Z","lastTransitionTime":"2025-11-25T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.005732 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.005758 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.005766 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.005775 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.005784 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:19Z","lastTransitionTime":"2025-11-25T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.097151 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.097187 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:19 crc kubenswrapper[4565]: E1125 09:05:19.097232 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.097290 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.097321 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:19 crc kubenswrapper[4565]: E1125 09:05:19.097349 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:19 crc kubenswrapper[4565]: E1125 09:05:19.097393 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:19 crc kubenswrapper[4565]: E1125 09:05:19.097452 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.107142 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.107164 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.107172 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.107181 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.107187 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:19Z","lastTransitionTime":"2025-11-25T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.209567 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.209590 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.209597 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.209606 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.209612 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:19Z","lastTransitionTime":"2025-11-25T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.310764 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.310787 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.310795 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.310803 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.310810 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:19Z","lastTransitionTime":"2025-11-25T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.412013 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.412086 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.412094 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.412103 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.412110 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:19Z","lastTransitionTime":"2025-11-25T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.514063 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.514092 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.514099 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.514108 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.514115 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:19Z","lastTransitionTime":"2025-11-25T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.615224 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.615256 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.615264 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.615275 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.615283 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:19Z","lastTransitionTime":"2025-11-25T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.717084 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.717112 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.717120 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.717131 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.717140 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:19Z","lastTransitionTime":"2025-11-25T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.818870 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.818900 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.818909 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.818920 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.818942 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:19Z","lastTransitionTime":"2025-11-25T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.920653 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.920682 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.920690 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.920706 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:19 crc kubenswrapper[4565]: I1125 09:05:19.920715 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:19Z","lastTransitionTime":"2025-11-25T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.022238 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.022331 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.022389 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.022471 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.022526 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:20Z","lastTransitionTime":"2025-11-25T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.124360 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.124390 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.124400 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.124411 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.124419 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:20Z","lastTransitionTime":"2025-11-25T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.226209 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.226275 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.226287 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.226300 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.226310 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:20Z","lastTransitionTime":"2025-11-25T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.327859 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.327886 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.327895 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.327904 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.327912 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:20Z","lastTransitionTime":"2025-11-25T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.380340 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.390536 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.392997 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"in
itContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:20Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.406822 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:20Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.427090 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:20Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.429266 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 
09:05:20.429287 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.429295 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.429307 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.429314 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:20Z","lastTransitionTime":"2025-11-25T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.437239 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:20Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.446525 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b454e10f5a27e7fcf592c6e895778990ee13d94756f5bd6a0ae252e1b00ade5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fc
adab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:
04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:20Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.453737 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:20Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.460733 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:20Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.468603 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:20Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.474702 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:20Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.487360 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019be
e1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1
dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:20Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.495493 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:20Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.507012 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:05:09Z\\\",\\\"message\\\":\\\"e crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 2025-08-24T17:21:41Z]\\\\nI1125 09:05:09.705848 6115 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:05:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e94
3b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:20Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.513165 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:20Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.519905 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be22f021-a051-4111-ba40-782e0c85f8b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114692b6925665d9a848950509e4a25a930fe69265b1db41a70f83a3bf10acdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0802c4e34d0a2da251d035aaf1249d86d594079eaf4d59a2939436fa9dd06d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dd8cl\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:20Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.526552 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fzpzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b047b2-31c7-45e7-a944-8d9c6de61061\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fzpzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:20Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:20 crc 
kubenswrapper[4565]: I1125 09:05:20.530692 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.530716 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.530723 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.530734 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.530743 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:20Z","lastTransitionTime":"2025-11-25T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.534367 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe3
77e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:20Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.541239 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c20540
2a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:20Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.632094 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.632123 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.632131 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:20 crc 
kubenswrapper[4565]: I1125 09:05:20.632142 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.632151 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:20Z","lastTransitionTime":"2025-11-25T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.733584 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.733612 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.733620 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.733649 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.733656 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:20Z","lastTransitionTime":"2025-11-25T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.835240 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.835294 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.835305 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.835316 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.835324 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:20Z","lastTransitionTime":"2025-11-25T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.937035 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.937060 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.937080 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.937090 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:20 crc kubenswrapper[4565]: I1125 09:05:20.937096 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:20Z","lastTransitionTime":"2025-11-25T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.038418 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.038521 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.038579 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.038639 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.038713 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:21Z","lastTransitionTime":"2025-11-25T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.096108 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.096138 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:21 crc kubenswrapper[4565]: E1125 09:05:21.096197 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.096292 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:21 crc kubenswrapper[4565]: E1125 09:05:21.096408 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:21 crc kubenswrapper[4565]: E1125 09:05:21.096330 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.096310 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:21 crc kubenswrapper[4565]: E1125 09:05:21.096598 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.140214 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.140299 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.140355 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.140420 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.140478 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:21Z","lastTransitionTime":"2025-11-25T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.242123 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.242149 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.242157 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.242168 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.242176 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:21Z","lastTransitionTime":"2025-11-25T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.344223 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.344353 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.344412 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.344464 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.344518 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:21Z","lastTransitionTime":"2025-11-25T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.445543 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.445668 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.445748 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.445834 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.445917 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:21Z","lastTransitionTime":"2025-11-25T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.547463 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.547549 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.547612 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.547665 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.547712 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:21Z","lastTransitionTime":"2025-11-25T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.649736 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.649759 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.649767 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.649776 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.649783 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:21Z","lastTransitionTime":"2025-11-25T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.751314 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.751335 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.751342 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.751352 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.751358 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:21Z","lastTransitionTime":"2025-11-25T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.853071 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.853106 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.853116 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.853127 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.853134 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:21Z","lastTransitionTime":"2025-11-25T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.954900 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.954917 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.954924 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.954954 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:21 crc kubenswrapper[4565]: I1125 09:05:21.954961 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:21Z","lastTransitionTime":"2025-11-25T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.056994 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.057017 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.057026 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.057034 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.057040 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:22Z","lastTransitionTime":"2025-11-25T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.158366 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.158385 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.158393 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.158402 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.158408 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:22Z","lastTransitionTime":"2025-11-25T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.260384 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.260416 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.260424 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.260435 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.260445 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:22Z","lastTransitionTime":"2025-11-25T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.361833 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.361854 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.361862 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.361871 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.361878 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:22Z","lastTransitionTime":"2025-11-25T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.463326 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.463352 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.463360 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.463370 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.463377 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:22Z","lastTransitionTime":"2025-11-25T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.565535 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.565560 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.565567 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.565576 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.565583 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:22Z","lastTransitionTime":"2025-11-25T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.667438 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.667466 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.667494 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.667506 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.667515 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:22Z","lastTransitionTime":"2025-11-25T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.769050 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.769085 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.769094 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.769107 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.769115 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:22Z","lastTransitionTime":"2025-11-25T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.870833 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.870864 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.870874 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.870887 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.870897 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:22Z","lastTransitionTime":"2025-11-25T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.972999 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.973023 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.973049 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.973059 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:22 crc kubenswrapper[4565]: I1125 09:05:22.973066 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:22Z","lastTransitionTime":"2025-11-25T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.074781 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.074804 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.074812 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.074822 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.074829 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:23Z","lastTransitionTime":"2025-11-25T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.096427 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:23 crc kubenswrapper[4565]: E1125 09:05:23.096506 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.096528 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.096436 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.096567 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:23 crc kubenswrapper[4565]: E1125 09:05:23.096591 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:23 crc kubenswrapper[4565]: E1125 09:05:23.096632 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:23 crc kubenswrapper[4565]: E1125 09:05:23.096741 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.176413 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.176442 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.176452 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.176464 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.176473 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:23Z","lastTransitionTime":"2025-11-25T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.278085 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.278115 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.278124 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.278134 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.278142 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:23Z","lastTransitionTime":"2025-11-25T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.379563 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.379595 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.379605 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.379617 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.379625 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:23Z","lastTransitionTime":"2025-11-25T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.481537 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.481562 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.481570 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.481582 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.481589 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:23Z","lastTransitionTime":"2025-11-25T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.583017 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.583043 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.583052 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.583061 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.583068 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:23Z","lastTransitionTime":"2025-11-25T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.684602 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.684633 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.684643 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.684653 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.684661 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:23Z","lastTransitionTime":"2025-11-25T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.786190 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.786217 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.786225 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.786234 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.786241 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:23Z","lastTransitionTime":"2025-11-25T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.888207 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.888238 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.888246 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.888260 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.888268 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:23Z","lastTransitionTime":"2025-11-25T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.990033 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.990075 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.990087 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.990102 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:23 crc kubenswrapper[4565]: I1125 09:05:23.990112 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:23Z","lastTransitionTime":"2025-11-25T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.091957 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.092001 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.092027 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.092040 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.092050 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:24Z","lastTransitionTime":"2025-11-25T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.193477 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.193513 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.193521 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.193553 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.193561 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:24Z","lastTransitionTime":"2025-11-25T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.295354 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.295401 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.295412 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.295424 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.295433 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:24Z","lastTransitionTime":"2025-11-25T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.397562 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.397598 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.397607 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.397620 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.397631 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:24Z","lastTransitionTime":"2025-11-25T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.499814 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.499845 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.499853 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.499865 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.499873 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:24Z","lastTransitionTime":"2025-11-25T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.601614 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.601647 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.601656 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.601668 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.601676 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:24Z","lastTransitionTime":"2025-11-25T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.703241 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.703277 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.703285 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.703298 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.703306 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:24Z","lastTransitionTime":"2025-11-25T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.804603 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.804629 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.804637 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.804649 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.804657 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:24Z","lastTransitionTime":"2025-11-25T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.907138 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.907199 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.907208 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.907224 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:24 crc kubenswrapper[4565]: I1125 09:05:24.907254 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:24Z","lastTransitionTime":"2025-11-25T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.009497 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.009523 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.009532 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.009542 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.009550 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:25Z","lastTransitionTime":"2025-11-25T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.096799 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.096838 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.096892 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:25 crc kubenswrapper[4565]: E1125 09:05:25.097018 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.097229 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:25 crc kubenswrapper[4565]: E1125 09:05:25.097543 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.097663 4565 scope.go:117] "RemoveContainer" containerID="c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8" Nov 25 09:05:25 crc kubenswrapper[4565]: E1125 09:05:25.097707 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:25 crc kubenswrapper[4565]: E1125 09:05:25.097777 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" Nov 25 09:05:25 crc kubenswrapper[4565]: E1125 09:05:25.097914 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.110718 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.110746 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.110753 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.110763 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.110771 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:25Z","lastTransitionTime":"2025-11-25T09:05:25Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.212780 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.212808 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.212816 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.212828 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.212835 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:25Z","lastTransitionTime":"2025-11-25T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.314748 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.314786 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.314796 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.314807 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.314816 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:25Z","lastTransitionTime":"2025-11-25T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.416974 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.417037 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.417047 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.417061 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.417069 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:25Z","lastTransitionTime":"2025-11-25T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.518856 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.518890 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.518899 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.518912 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.518920 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:25Z","lastTransitionTime":"2025-11-25T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.620344 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.620370 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.620378 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.620390 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.620398 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:25Z","lastTransitionTime":"2025-11-25T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.722541 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.722574 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.722582 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.722594 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.722602 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:25Z","lastTransitionTime":"2025-11-25T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.824163 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.824189 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.824196 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.824207 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.824216 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:25Z","lastTransitionTime":"2025-11-25T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.926155 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.926193 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.926203 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.926217 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:25 crc kubenswrapper[4565]: I1125 09:05:25.926227 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:25Z","lastTransitionTime":"2025-11-25T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.028092 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.028123 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.028131 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.028144 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.028169 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:26Z","lastTransitionTime":"2025-11-25T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.130290 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.130322 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.130343 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.130354 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.130362 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:26Z","lastTransitionTime":"2025-11-25T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.231728 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.231769 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.231779 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.231793 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.231801 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:26Z","lastTransitionTime":"2025-11-25T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.333770 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.333816 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.333824 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.333837 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.333844 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:26Z","lastTransitionTime":"2025-11-25T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.435745 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.435773 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.435782 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.435793 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.435802 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:26Z","lastTransitionTime":"2025-11-25T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.537692 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.537735 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.537745 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.537755 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.537762 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:26Z","lastTransitionTime":"2025-11-25T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.639695 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.639726 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.639735 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.639747 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.639755 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:26Z","lastTransitionTime":"2025-11-25T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.741774 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.741799 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.741807 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.741818 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.741825 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:26Z","lastTransitionTime":"2025-11-25T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.843413 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.843436 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.843443 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.843453 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.843462 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:26Z","lastTransitionTime":"2025-11-25T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.944785 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.944814 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.944823 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.944833 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:26 crc kubenswrapper[4565]: I1125 09:05:26.944841 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:26Z","lastTransitionTime":"2025-11-25T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.046417 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.046456 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.046466 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.046481 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.046491 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:27Z","lastTransitionTime":"2025-11-25T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.096657 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.096689 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.096665 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:27 crc kubenswrapper[4565]: E1125 09:05:27.096757 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.096788 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:27 crc kubenswrapper[4565]: E1125 09:05:27.096836 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:27 crc kubenswrapper[4565]: E1125 09:05:27.096951 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:27 crc kubenswrapper[4565]: E1125 09:05:27.097030 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.108560 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.116638 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.125268 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.132065 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.146399 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019be
e1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1
dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.147669 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.147698 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.147707 4565 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.147718 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.147726 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:27Z","lastTransitionTime":"2025-11-25T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.155867 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.168958 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:05:09Z\\\",\\\"message\\\":\\\"e crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 
2025-08-24T17:21:41Z]\\\\nI1125 09:05:09.705848 6115 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:05:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e94
3b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.175631 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.184456 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be22f021-a051-4111-ba40-782e0c85f8b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114692b6925665d9a848950509e4a25a930fe69265b1db41a70f83a3bf10acdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0802c4e34d0a2da251d035aaf1249d86d594079eaf4d59a2939436fa9dd06d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dd8cl\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.191259 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fzpzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b047b2-31c7-45e7-a944-8d9c6de61061\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fzpzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:27 crc 
kubenswrapper[4565]: I1125 09:05:27.199652 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.206489 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c20540
2a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.215322 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7e
cc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.223022 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.230056 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.237562 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.245965 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b454e10f5a27e7fcf592c6e895778990ee13d94756f5bd6a0ae252e1b00ade5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fc
adab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:
04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.249890 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.249918 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.249939 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.249952 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.249960 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:27Z","lastTransitionTime":"2025-11-25T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.252678 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91a29f2c-150e-44ca-ac6b-01c74197d120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5495757ee21b1874ef175f98308016e06007f6a55bc7b55376b82cf291878a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48dee57ce020adcee27aca2d8950
c9cebead8c430113c4c151b08babac9299\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3129aed1548617b0b63fe023e6112d5be72db903bcc44a95a376cd3f42be0d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13c927d9561a2a8f119338d4774db6f33bdd828cafcd0778c7a569da5526f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13c927d9561a2a8f119338d4774db6f33bdd828cafcd0778c7a569da5526f56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.351019 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.351047 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.351063 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.351078 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.351087 4565 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:27Z","lastTransitionTime":"2025-11-25T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.452470 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.452523 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.452533 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.452546 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.452554 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:27Z","lastTransitionTime":"2025-11-25T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.554434 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.554477 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.554485 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.554499 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.554507 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:27Z","lastTransitionTime":"2025-11-25T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.656363 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.656415 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.656424 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.656438 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.656446 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:27Z","lastTransitionTime":"2025-11-25T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.758065 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.758088 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.758096 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.758106 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.758113 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:27Z","lastTransitionTime":"2025-11-25T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.860089 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.860109 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.860117 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.860126 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.860134 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:27Z","lastTransitionTime":"2025-11-25T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.961485 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.961536 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.961546 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.961573 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:27 crc kubenswrapper[4565]: I1125 09:05:27.961581 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:27Z","lastTransitionTime":"2025-11-25T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.063026 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.063066 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.063074 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.063089 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.063097 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:28Z","lastTransitionTime":"2025-11-25T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.165404 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.165439 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.165448 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.165460 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.165468 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:28Z","lastTransitionTime":"2025-11-25T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.267512 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.267542 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.267550 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.267562 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.267570 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:28Z","lastTransitionTime":"2025-11-25T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.368992 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.369038 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.369048 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.369063 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.369073 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:28Z","lastTransitionTime":"2025-11-25T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.470481 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.470513 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.470521 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.470535 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.470543 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:28Z","lastTransitionTime":"2025-11-25T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.572178 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.572213 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.572222 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.572234 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.572243 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:28Z","lastTransitionTime":"2025-11-25T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.673880 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.673914 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.673923 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.673976 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.673985 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:28Z","lastTransitionTime":"2025-11-25T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.750596 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.750632 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.750641 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.750654 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.750662 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:28Z","lastTransitionTime":"2025-11-25T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:28 crc kubenswrapper[4565]: E1125 09:05:28.759325 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:28Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.761816 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.761843 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.761851 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.761880 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.761903 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:28Z","lastTransitionTime":"2025-11-25T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:28 crc kubenswrapper[4565]: E1125 09:05:28.770072 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:28Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.771944 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.771980 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.771991 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.772017 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.772026 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:28Z","lastTransitionTime":"2025-11-25T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:28 crc kubenswrapper[4565]: E1125 09:05:28.779208 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:28Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.781059 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.781087 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.781095 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.781105 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.781113 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:28Z","lastTransitionTime":"2025-11-25T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:28 crc kubenswrapper[4565]: E1125 09:05:28.788486 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:28Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.790497 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.790525 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.790533 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.790543 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.790549 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:28Z","lastTransitionTime":"2025-11-25T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:28 crc kubenswrapper[4565]: E1125 09:05:28.798212 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:28Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:28 crc kubenswrapper[4565]: E1125 09:05:28.798313 4565 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.799168 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.799195 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.799203 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.799216 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.799224 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:28Z","lastTransitionTime":"2025-11-25T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.900994 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.901033 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.901043 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.901054 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:28 crc kubenswrapper[4565]: I1125 09:05:28.901062 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:28Z","lastTransitionTime":"2025-11-25T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.002962 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.002990 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.002998 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.003024 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.003032 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:29Z","lastTransitionTime":"2025-11-25T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.097045 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.097071 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.097093 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.097058 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:29 crc kubenswrapper[4565]: E1125 09:05:29.097183 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:29 crc kubenswrapper[4565]: E1125 09:05:29.097246 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:29 crc kubenswrapper[4565]: E1125 09:05:29.097297 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:29 crc kubenswrapper[4565]: E1125 09:05:29.097342 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.104520 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.104551 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.104560 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.104572 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.104604 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:29Z","lastTransitionTime":"2025-11-25T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.206432 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.206459 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.206467 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.206477 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.206484 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:29Z","lastTransitionTime":"2025-11-25T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.308645 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.308668 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.308675 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.308684 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.308691 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:29Z","lastTransitionTime":"2025-11-25T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.410409 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.410438 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.410446 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.410457 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.410467 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:29Z","lastTransitionTime":"2025-11-25T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.512529 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.512561 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.512571 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.512582 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.512591 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:29Z","lastTransitionTime":"2025-11-25T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.613656 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.613692 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.613701 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.613711 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.613719 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:29Z","lastTransitionTime":"2025-11-25T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.715700 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.715752 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.715761 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.715774 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.715790 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:29Z","lastTransitionTime":"2025-11-25T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.817101 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.817157 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.817169 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.817180 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.817189 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:29Z","lastTransitionTime":"2025-11-25T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.919146 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.919515 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.919589 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.919666 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:29 crc kubenswrapper[4565]: I1125 09:05:29.919721 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:29Z","lastTransitionTime":"2025-11-25T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.021155 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.021185 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.021193 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.021206 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.021214 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:30Z","lastTransitionTime":"2025-11-25T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.123359 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.123556 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.123626 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.123696 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.123755 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:30Z","lastTransitionTime":"2025-11-25T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.225204 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.225240 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.225250 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.225263 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.225272 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:30Z","lastTransitionTime":"2025-11-25T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.327565 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.327744 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.327809 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.327878 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.327947 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:30Z","lastTransitionTime":"2025-11-25T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.429757 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.429785 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.429794 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.429806 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.429817 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:30Z","lastTransitionTime":"2025-11-25T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.531237 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.531271 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.531280 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.531292 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.531302 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:30Z","lastTransitionTime":"2025-11-25T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.632477 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.632499 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.632508 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.632518 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.632525 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:30Z","lastTransitionTime":"2025-11-25T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.734010 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.734046 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.734055 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.734067 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.734075 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:30Z","lastTransitionTime":"2025-11-25T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.797792 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs\") pod \"network-metrics-daemon-fzpzk\" (UID: \"b5b047b2-31c7-45e7-a944-8d9c6de61061\") " pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:30 crc kubenswrapper[4565]: E1125 09:05:30.797962 4565 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 09:05:30 crc kubenswrapper[4565]: E1125 09:05:30.798031 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs podName:b5b047b2-31c7-45e7-a944-8d9c6de61061 nodeName:}" failed. No retries permitted until 2025-11-25 09:06:02.798003656 +0000 UTC m=+96.000498794 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs") pod "network-metrics-daemon-fzpzk" (UID: "b5b047b2-31c7-45e7-a944-8d9c6de61061") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.835410 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.835444 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.835454 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.835467 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.835475 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:30Z","lastTransitionTime":"2025-11-25T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.936816 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.936847 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.936856 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.936866 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:30 crc kubenswrapper[4565]: I1125 09:05:30.936874 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:30Z","lastTransitionTime":"2025-11-25T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.038621 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.038665 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.038678 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.038694 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.038704 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:31Z","lastTransitionTime":"2025-11-25T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.096639 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.096656 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.096686 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.096696 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:31 crc kubenswrapper[4565]: E1125 09:05:31.096742 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:31 crc kubenswrapper[4565]: E1125 09:05:31.096830 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:31 crc kubenswrapper[4565]: E1125 09:05:31.096877 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:31 crc kubenswrapper[4565]: E1125 09:05:31.096943 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.140483 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.140527 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.140537 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.140549 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.140557 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:31Z","lastTransitionTime":"2025-11-25T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.242299 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.242350 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.242359 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.242369 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.242378 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:31Z","lastTransitionTime":"2025-11-25T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.344444 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.344497 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.344508 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.344521 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.344530 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:31Z","lastTransitionTime":"2025-11-25T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.446645 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.446706 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.446716 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.446727 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.446734 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:31Z","lastTransitionTime":"2025-11-25T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.548979 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.549002 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.549011 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.549049 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.549057 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:31Z","lastTransitionTime":"2025-11-25T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.650657 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.650688 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.650698 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.650708 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.650715 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:31Z","lastTransitionTime":"2025-11-25T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.752448 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.752582 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.752639 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.752700 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.752755 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:31Z","lastTransitionTime":"2025-11-25T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.854619 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.854651 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.854659 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.854672 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.854680 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:31Z","lastTransitionTime":"2025-11-25T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.955763 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.955847 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.955858 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.955868 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:31 crc kubenswrapper[4565]: I1125 09:05:31.955875 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:31Z","lastTransitionTime":"2025-11-25T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.057885 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.057918 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.057968 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.057982 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.057990 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:32Z","lastTransitionTime":"2025-11-25T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.159358 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.159405 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.159419 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.159436 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.159448 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:32Z","lastTransitionTime":"2025-11-25T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.261243 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.261279 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.261290 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.261302 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.261310 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:32Z","lastTransitionTime":"2025-11-25T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.362857 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.362891 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.362900 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.362913 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.362922 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:32Z","lastTransitionTime":"2025-11-25T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.464550 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.464578 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.464588 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.464598 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.464606 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:32Z","lastTransitionTime":"2025-11-25T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.566583 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.566605 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.566613 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.566623 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.566630 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:32Z","lastTransitionTime":"2025-11-25T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.667558 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.667586 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.667596 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.667605 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.667612 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:32Z","lastTransitionTime":"2025-11-25T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.769555 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.769584 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.769594 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.769603 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.769610 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:32Z","lastTransitionTime":"2025-11-25T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.871014 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.871087 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.871099 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.871110 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.871118 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:32Z","lastTransitionTime":"2025-11-25T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.972527 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.972918 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.973013 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.973094 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:32 crc kubenswrapper[4565]: I1125 09:05:32.973158 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:32Z","lastTransitionTime":"2025-11-25T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.074319 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.074353 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.074365 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.074379 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.074387 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:33Z","lastTransitionTime":"2025-11-25T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.096888 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.096937 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.096970 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:33 crc kubenswrapper[4565]: E1125 09:05:33.097067 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.097075 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:33 crc kubenswrapper[4565]: E1125 09:05:33.097130 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:33 crc kubenswrapper[4565]: E1125 09:05:33.097287 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:33 crc kubenswrapper[4565]: E1125 09:05:33.097378 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.176432 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.176547 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.176610 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.176669 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.176725 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:33Z","lastTransitionTime":"2025-11-25T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.278591 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.278615 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.278624 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.278633 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.278640 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:33Z","lastTransitionTime":"2025-11-25T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.380297 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.380322 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.380330 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.380343 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.380352 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:33Z","lastTransitionTime":"2025-11-25T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.384773 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jpfp5_6d96c20a-2514-47cf-99ec-a314bacac513/kube-multus/0.log" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.384812 4565 generic.go:334] "Generic (PLEG): container finished" podID="6d96c20a-2514-47cf-99ec-a314bacac513" containerID="a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9" exitCode=1 Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.384833 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jpfp5" event={"ID":"6d96c20a-2514-47cf-99ec-a314bacac513","Type":"ContainerDied","Data":"a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9"} Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.385101 4565 scope.go:117] "RemoveContainer" containerID="a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.394492 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91a29f2c-150e-44ca-ac6b-01c74197d120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5495757ee21b1874ef175f98308016e06007f6a55bc7b55376b82cf291878a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48dee57ce020adcee27aca2d8950c9cebead8c430113c4c151b08babac9299\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3129aed1548617b0b63fe023e6112d5be72db903bcc44a95a376cd3f42be0d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13c927d9561a2a8f119338d4774db6f33bdd828cafcd0778c7a569da5526f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f13c927d9561a2a8f119338d4774db6f33bdd828cafcd0778c7a569da5526f56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:33Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.404031 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:33Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.412017 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:05:32Z\\\",\\\"message\\\":\\\"2025-11-25T09:04:47+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_aef70dfb-3bc9-425e-981d-272c2640c285\\\\n2025-11-25T09:04:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_aef70dfb-3bc9-425e-981d-272c2640c285 to /host/opt/cni/bin/\\\\n2025-11-25T09:04:47Z [verbose] multus-daemon started\\\\n2025-11-25T09:04:47Z [verbose] Readiness Indicator file check\\\\n2025-11-25T09:05:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:33Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.422497 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b454e10f5a27e7fcf592c6e895778990ee13d94756f5bd6a0ae252e1b00ade5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0e3
d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:33Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.434870 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa0
82fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:33Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.442087 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:33Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.448979 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:05:33Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.457456 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:33Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.465970 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:33Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.472508 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:33Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.482011 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.482054 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.482064 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.482076 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.482085 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:33Z","lastTransitionTime":"2025-11-25T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.486041 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe3
77e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:33Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.493694 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c20540
2a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:33Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.505451 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:05:09Z\\\",\\\"message\\\":\\\"e crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 
2025-08-24T17:21:41Z]\\\\nI1125 09:05:09.705848 6115 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:05:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e94
3b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:33Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.512141 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:33Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.520845 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be22f021-a051-4111-ba40-782e0c85f8b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114692b6925665d9a848950509e4a25a930fe69265b1db41a70f83a3bf10acdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0802c4e34d0a2da251d035aaf1249d86d594079eaf4d59a2939436fa9dd06d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dd8cl\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:33Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.527533 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fzpzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b047b2-31c7-45e7-a944-8d9c6de61061\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fzpzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:33Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:33 crc 
kubenswrapper[4565]: I1125 09:05:33.536739 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00
b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:33Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.544823 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:33Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.583831 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.583919 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.584020 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.584096 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.584162 4565 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:33Z","lastTransitionTime":"2025-11-25T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.686215 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.686249 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.686259 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.686271 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.686279 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:33Z","lastTransitionTime":"2025-11-25T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.787584 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.787618 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.787626 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.787638 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.787647 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:33Z","lastTransitionTime":"2025-11-25T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.889408 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.889441 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.889449 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.889460 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.889467 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:33Z","lastTransitionTime":"2025-11-25T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.991176 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.991216 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.991227 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.991241 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:33 crc kubenswrapper[4565]: I1125 09:05:33.991251 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:33Z","lastTransitionTime":"2025-11-25T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.093793 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.093884 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.093977 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.094048 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.094114 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:34Z","lastTransitionTime":"2025-11-25T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.195742 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.195865 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.195995 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.196083 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.196151 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:34Z","lastTransitionTime":"2025-11-25T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.298106 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.298150 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.298160 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.298173 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.298181 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:34Z","lastTransitionTime":"2025-11-25T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.388905 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jpfp5_6d96c20a-2514-47cf-99ec-a314bacac513/kube-multus/0.log" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.388968 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jpfp5" event={"ID":"6d96c20a-2514-47cf-99ec-a314bacac513","Type":"ContainerStarted","Data":"a957dd6a78e51bbac2e5e91939083721f8d1b1efcb75f447880a584b6af8c59e"} Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.399006 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b454e10f5a27e7fcf592c6e895778990ee13d94756f5bd6a0ae252e1b00ade5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\
"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\
"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"exitCode\\\":0,\\\"f
inishedAt\\\":\\\"2025-11-25T09:04:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:34Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.399986 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.400076 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.400152 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.400213 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.400266 4565 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:34Z","lastTransitionTime":"2025-11-25T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.407291 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91a29f2c-150e-44ca-ac6b-01c74197d120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5495757ee21b1874ef175f98308016e06007f6a55bc7b55376b82cf291878a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48dee57ce020adcee27aca2d8950c9cebead8c430113c4c151b08babac9299\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3129aed1548617b0b63fe023e6112d5be72db903bcc44a95a376cd3f42be0d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13c927d9561a2a8f119338d4774db6f33bdd828cafcd0778c7a569da5526f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13c927d9561a2a8f119338d4774db6f33bdd828cafcd0778c7a569da5526f56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:34Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.415679 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:34Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.424440 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a957dd6a78e51bbac2e5e91939083721f8d1b1efcb75f447880a584b6af8c59e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:05:32Z\\\",\\\"message\\\":\\\"2025-11-25T09:04:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_aef70dfb-3bc9-425e-981d-272c2640c285\\\\n2025-11-25T09:04:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_aef70dfb-3bc9-425e-981d-272c2640c285 to /host/opt/cni/bin/\\\\n2025-11-25T09:04:47Z [verbose] multus-daemon started\\\\n2025-11-25T09:04:47Z [verbose] 
Readiness Indicator file check\\\\n2025-11-25T09:05:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:34Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.433249 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:34Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.440015 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:34Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.452869 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019be
e1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1
dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:34Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.461277 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:34Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.468846 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:05:34Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.476470 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:34Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.483702 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be22f021-a051-4111-ba40-782e0c85f8b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114692b6925665d9a848950509e4a25a930fe69265b1db41a70f83a3bf10acdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0802c4e34d0a2da251d035aaf1249d86d5940
79eaf4d59a2939436fa9dd06d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dd8cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:34Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.490092 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fzpzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b047b2-31c7-45e7-a944-8d9c6de61061\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fzpzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:34Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:34 crc 
kubenswrapper[4565]: I1125 09:05:34.497918 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:34Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.501644 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.501678 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.501689 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.501704 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.501713 4565 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:34Z","lastTransitionTime":"2025-11-25T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.504950 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c205402a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-11-25T09:05:34Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.516722 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:05:09Z\\\",\\\"message\\\":\\\"e crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 
2025-08-24T17:21:41Z]\\\\nI1125 09:05:09.705848 6115 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:05:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e94
3b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:34Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.523243 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:34Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.531482 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-
25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:34Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.539234 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:34Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.605699 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.605883 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.605971 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.606094 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.606180 4565 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:34Z","lastTransitionTime":"2025-11-25T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.708634 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.708666 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.708675 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.708687 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.708695 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:34Z","lastTransitionTime":"2025-11-25T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.810966 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.810999 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.811008 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.811022 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.811074 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:34Z","lastTransitionTime":"2025-11-25T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.913112 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.913134 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.913143 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.913151 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:34 crc kubenswrapper[4565]: I1125 09:05:34.913158 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:34Z","lastTransitionTime":"2025-11-25T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.014743 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.014763 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.014770 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.014779 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.014786 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:35Z","lastTransitionTime":"2025-11-25T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.096583 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.096603 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.096638 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.096858 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:35 crc kubenswrapper[4565]: E1125 09:05:35.096967 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:35 crc kubenswrapper[4565]: E1125 09:05:35.097157 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:35 crc kubenswrapper[4565]: E1125 09:05:35.097216 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:35 crc kubenswrapper[4565]: E1125 09:05:35.097250 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.115832 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.115876 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.115887 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.115899 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.115908 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:35Z","lastTransitionTime":"2025-11-25T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.217700 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.217730 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.217739 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.217751 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.217759 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:35Z","lastTransitionTime":"2025-11-25T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.319044 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.319088 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.319097 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.319108 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.319116 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:35Z","lastTransitionTime":"2025-11-25T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.421144 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.421181 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.421190 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.421205 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.421215 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:35Z","lastTransitionTime":"2025-11-25T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.522761 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.522795 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.522804 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.522819 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.522827 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:35Z","lastTransitionTime":"2025-11-25T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.624568 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.624588 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.624596 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.624605 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.624613 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:35Z","lastTransitionTime":"2025-11-25T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.725847 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.725888 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.725898 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.725911 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.725923 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:35Z","lastTransitionTime":"2025-11-25T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.827856 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.827882 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.827892 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.827902 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.827910 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:35Z","lastTransitionTime":"2025-11-25T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.929334 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.929361 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.929370 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.929384 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:35 crc kubenswrapper[4565]: I1125 09:05:35.929392 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:35Z","lastTransitionTime":"2025-11-25T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.031003 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.031025 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.031047 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.031058 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.031066 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:36Z","lastTransitionTime":"2025-11-25T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.132492 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.132514 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.132523 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.132532 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.132539 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:36Z","lastTransitionTime":"2025-11-25T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.234494 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.234516 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.234523 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.234531 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.234538 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:36Z","lastTransitionTime":"2025-11-25T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.336230 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.336252 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.336259 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.336269 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.336275 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:36Z","lastTransitionTime":"2025-11-25T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.438361 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.438383 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.438391 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.438404 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.438410 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:36Z","lastTransitionTime":"2025-11-25T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.540232 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.540265 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.540274 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.540289 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.540297 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:36Z","lastTransitionTime":"2025-11-25T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.641982 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.642015 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.642022 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.642044 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.642053 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:36Z","lastTransitionTime":"2025-11-25T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.744123 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.744146 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.744156 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.744166 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.744173 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:36Z","lastTransitionTime":"2025-11-25T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.845557 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.845589 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.845597 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.845609 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.845617 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:36Z","lastTransitionTime":"2025-11-25T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.949543 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.949565 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.949572 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.949582 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:36 crc kubenswrapper[4565]: I1125 09:05:36.949589 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:36Z","lastTransitionTime":"2025-11-25T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.050835 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.050853 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.050860 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.050870 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.050877 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:37Z","lastTransitionTime":"2025-11-25T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.096421 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.096437 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:37 crc kubenswrapper[4565]: E1125 09:05:37.096495 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.096595 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.096421 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:37 crc kubenswrapper[4565]: E1125 09:05:37.096647 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:37 crc kubenswrapper[4565]: E1125 09:05:37.097003 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:37 crc kubenswrapper[4565]: E1125 09:05:37.097080 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.106854 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:37Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.114624 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c20540
2a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:37Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.127517 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:05:09Z\\\",\\\"message\\\":\\\"e crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 
2025-08-24T17:21:41Z]\\\\nI1125 09:05:09.705848 6115 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:05:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e94
3b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:37Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.133571 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:37Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.140161 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be22f021-a051-4111-ba40-782e0c85f8b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114692b6925665d9a848950509e4a25a930fe69265b1db41a70f83a3bf10acdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0802c4e34d0a2da251d035aaf1249d86d594079eaf4d59a2939436fa9dd06d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dd8cl\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:37Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.150748 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fzpzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b047b2-31c7-45e7-a944-8d9c6de61061\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fzpzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:37Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:37 crc 
kubenswrapper[4565]: I1125 09:05:37.152288 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.152312 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.152320 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.152331 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.152340 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:37Z","lastTransitionTime":"2025-11-25T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.159757 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:37Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.168724 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:37Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.175799 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91a29f2c-150e-44ca-ac6b-01c74197d120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5495757ee21b1874ef175f98308016e06007f6a55bc7b55376b82cf291878a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48dee57ce020adcee27aca2d8950c9cebead8c430113c4c151b08babac9299\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3129aed1548617b0b63fe023e6112d5be72db903bcc44a95a376cd3f42be0d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13c927d9561a2a8f119338d4774db6f33bdd828cafcd0778c7a569da5526f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f13c927d9561a2a8f119338d4774db6f33bdd828cafcd0778c7a569da5526f56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:37Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.183294 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:37Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.191223 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a957dd6a78e51bbac2e5e91939083721f8d1b1efcb75f447880a584b6af8c59e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:05:32Z\\\",\\\"message\\\":\\\"2025-11-25T09:04:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_aef70dfb-3bc9-425e-981d-272c2640c285\\\\n2025-11-25T09:04:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_aef70dfb-3bc9-425e-981d-272c2640c285 to /host/opt/cni/bin/\\\\n2025-11-25T09:04:47Z [verbose] multus-daemon started\\\\n2025-11-25T09:04:47Z [verbose] 
Readiness Indicator file check\\\\n2025-11-25T09:05:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:37Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.199954 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b4
54e10f5a27e7fcf592c6e895778990ee13d94756f5bd6a0ae252e1b00ade5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:37Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.212084 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:37Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.219305 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:37Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.226144 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:05:37Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.236428 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:37Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.245446 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:37Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.251823 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:37Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.253823 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.253846 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.253854 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.253864 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.253871 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:37Z","lastTransitionTime":"2025-11-25T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.355112 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.355140 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.355149 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.355160 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.355166 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:37Z","lastTransitionTime":"2025-11-25T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.456870 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.456896 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.456906 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.456918 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.456942 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:37Z","lastTransitionTime":"2025-11-25T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.558446 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.558486 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.558495 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.558504 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.558511 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:37Z","lastTransitionTime":"2025-11-25T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.660488 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.660515 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.660523 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.660532 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.660541 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:37Z","lastTransitionTime":"2025-11-25T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.762077 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.762109 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.762156 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.762170 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.762180 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:37Z","lastTransitionTime":"2025-11-25T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.863841 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.863955 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.864024 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.864100 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.864157 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:37Z","lastTransitionTime":"2025-11-25T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.965987 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.966017 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.966040 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.966052 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:37 crc kubenswrapper[4565]: I1125 09:05:37.966061 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:37Z","lastTransitionTime":"2025-11-25T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.067346 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.067377 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.067386 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.067396 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.067403 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:38Z","lastTransitionTime":"2025-11-25T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.097009 4565 scope.go:117] "RemoveContainer" containerID="c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.169054 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.169084 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.169093 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.169105 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.169114 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:38Z","lastTransitionTime":"2025-11-25T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.271176 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.271206 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.271214 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.271227 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.271235 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:38Z","lastTransitionTime":"2025-11-25T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.373238 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.373284 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.373294 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.373309 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.373319 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:38Z","lastTransitionTime":"2025-11-25T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.398718 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vk74d_23e95c48-8d61-4222-a968-b86203ef8aab/ovnkube-controller/2.log" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.400734 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerStarted","Data":"7b95d45578a8935128ba3e8834bef55a7ffce6465b6e61628550da58ad22e576"} Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.401491 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.415189 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a
1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.423512 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.436235 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.444581 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.454237 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.463328 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.471453 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-
syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.474657 4565 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.474690 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.474698 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.474711 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.474719 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:38Z","lastTransitionTime":"2025-11-25T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.479230 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c205402a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.494017 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b95d45578a8935128ba3e8834bef55a7ffce6465b6e61628550da58ad22e576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:05:09Z\\\",\\\"message\\\":\\\"e crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 
2025-08-24T17:21:41Z]\\\\nI1125 09:05:09.705848 6115 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{c02bd945-d57b-49ff-9cd3-202ed3574\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:05:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run
/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.504653 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.513830 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be22f021-a051-4111-ba40-782e0c85f8b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114692b6925665d9a848950509e4a25a930fe69265b1db41a70f83a3bf10acdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0802c4e34d0a2da251d035aaf1249d86d594079eaf4d59a2939436fa9dd06d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dd8cl\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.522354 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fzpzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b047b2-31c7-45e7-a944-8d9c6de61061\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fzpzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:38 crc 
kubenswrapper[4565]: I1125 09:05:38.534394 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00
b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.542093 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.550442 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91a29f2c-150e-44ca-ac6b-01c74197d120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5495757ee21b1874ef175f98308016e06007f6a55bc7b55376b82cf291878a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48dee57ce020adcee27aca2d8950c9cebead8c430113c4c151b08babac9299\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3129aed1548617b0b63fe023e6112d5be72db903bcc44a95a376cd3f42be0d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13c927d9561a2a8f119338d4774db6f33bdd828cafcd0778c7a569da5526f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f13c927d9561a2a8f119338d4774db6f33bdd828cafcd0778c7a569da5526f56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.559138 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.567031 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a957dd6a78e51bbac2e5e91939083721f8d1b1efcb75f447880a584b6af8c59e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:05:32Z\\\",\\\"message\\\":\\\"2025-11-25T09:04:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_aef70dfb-3bc9-425e-981d-272c2640c285\\\\n2025-11-25T09:04:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_aef70dfb-3bc9-425e-981d-272c2640c285 to /host/opt/cni/bin/\\\\n2025-11-25T09:04:47Z [verbose] multus-daemon started\\\\n2025-11-25T09:04:47Z [verbose] 
Readiness Indicator file check\\\\n2025-11-25T09:05:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.576299 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b4
54e10f5a27e7fcf592c6e895778990ee13d94756f5bd6a0ae252e1b00ade5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.576471 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.576492 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.576500 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.576513 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:38 crc kubenswrapper[4565]: 
I1125 09:05:38.576521 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:38Z","lastTransitionTime":"2025-11-25T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.677826 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.677853 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.677862 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.677882 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.677891 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:38Z","lastTransitionTime":"2025-11-25T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.779960 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.780011 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.780021 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.780041 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.780051 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:38Z","lastTransitionTime":"2025-11-25T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.881842 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.881863 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.881870 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.881879 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.881886 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:38Z","lastTransitionTime":"2025-11-25T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.928594 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.928623 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.928631 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.928641 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.928648 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:38Z","lastTransitionTime":"2025-11-25T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:38 crc kubenswrapper[4565]: E1125 09:05:38.938724 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.940960 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.940981 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.940989 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.941002 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.941010 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:38Z","lastTransitionTime":"2025-11-25T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:38 crc kubenswrapper[4565]: E1125 09:05:38.948894 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.953682 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.953724 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.953737 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.953752 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.953770 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:38Z","lastTransitionTime":"2025-11-25T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:38 crc kubenswrapper[4565]: E1125 09:05:38.964089 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.966350 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.966383 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.966394 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.966407 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.966415 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:38Z","lastTransitionTime":"2025-11-25T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:38 crc kubenswrapper[4565]: E1125 09:05:38.974436 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.976536 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.976562 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.976571 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.976581 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.976588 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:38Z","lastTransitionTime":"2025-11-25T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:38 crc kubenswrapper[4565]: E1125 09:05:38.984294 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:38 crc kubenswrapper[4565]: E1125 09:05:38.984396 4565 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.985297 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.985385 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.985441 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.985496 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:38 crc kubenswrapper[4565]: I1125 09:05:38.985571 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:38Z","lastTransitionTime":"2025-11-25T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.087556 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.087949 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.088017 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.088093 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.088145 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:39Z","lastTransitionTime":"2025-11-25T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.097041 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.097052 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.097054 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.097097 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:39 crc kubenswrapper[4565]: E1125 09:05:39.097120 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:39 crc kubenswrapper[4565]: E1125 09:05:39.097200 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:39 crc kubenswrapper[4565]: E1125 09:05:39.097279 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:39 crc kubenswrapper[4565]: E1125 09:05:39.097313 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.189654 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.189681 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.189691 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.189701 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.189709 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:39Z","lastTransitionTime":"2025-11-25T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.291338 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.291361 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.291369 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.291382 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.291390 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:39Z","lastTransitionTime":"2025-11-25T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.392972 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.392999 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.393008 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.393017 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.393031 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:39Z","lastTransitionTime":"2025-11-25T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.404158 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vk74d_23e95c48-8d61-4222-a968-b86203ef8aab/ovnkube-controller/3.log" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.404623 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vk74d_23e95c48-8d61-4222-a968-b86203ef8aab/ovnkube-controller/2.log" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.406398 4565 generic.go:334] "Generic (PLEG): container finished" podID="23e95c48-8d61-4222-a968-b86203ef8aab" containerID="7b95d45578a8935128ba3e8834bef55a7ffce6465b6e61628550da58ad22e576" exitCode=1 Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.406428 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerDied","Data":"7b95d45578a8935128ba3e8834bef55a7ffce6465b6e61628550da58ad22e576"} Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.406449 4565 scope.go:117] "RemoveContainer" containerID="c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.406904 4565 scope.go:117] "RemoveContainer" containerID="7b95d45578a8935128ba3e8834bef55a7ffce6465b6e61628550da58ad22e576" Nov 25 09:05:39 crc kubenswrapper[4565]: E1125 09:05:39.407413 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.419200 4565 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb
76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.428011 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.436017 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a957dd6a78e51bbac2e5e91939083721f8d1b1efcb75f447880a584b6af8c59e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:05:32Z\\\",\\\"message\\\":\\\"2025-11-25T09:04:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_aef70dfb-3bc9-425e-981d-272c2640c285\\\\n2025-11-25T09:04:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_aef70dfb-3bc9-425e-981d-272c2640c285 to /host/opt/cni/bin/\\\\n2025-11-25T09:04:47Z [verbose] multus-daemon started\\\\n2025-11-25T09:04:47Z [verbose] 
Readiness Indicator file check\\\\n2025-11-25T09:05:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.444598 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b4
54e10f5a27e7fcf592c6e895778990ee13d94756f5bd6a0ae252e1b00ade5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.452359 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91a29f2c-150e-44ca-ac6b-01c74197d120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5495757ee21b1874ef175f98308016e06007f6a55bc7b55376b82cf291878a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48dee57ce020adcee27aca2d8950c9cebead8c430113c4c151b08babac9299\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3129aed1548617b0b63fe023e6112d5be72db903bcc44a95a376cd3f42be0d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13c927d9561a2a8f119338d4774db6f33bdd828cafcd0778c7a569da5526f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f13c927d9561a2a8f119338d4774db6f33bdd828cafcd0778c7a569da5526f56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.459662 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.467246 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.475311 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.481720 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.494016 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019be
e1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1
dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.494251 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.494272 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.494280 4565 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.494291 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.494299 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:39Z","lastTransitionTime":"2025-11-25T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.501587 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.508623 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.514652 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.521407 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be22f021-a051-4111-ba40-782e0c85f8b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114692b6925665d9a848950509e4a25a930fe69265b1db41a70f83a3bf10acdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0802c4e34d0a2da251d035aaf1249d86d5940
79eaf4d59a2939436fa9dd06d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dd8cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.527636 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fzpzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b047b2-31c7-45e7-a944-8d9c6de61061\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fzpzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:39 crc 
kubenswrapper[4565]: I1125 09:05:39.534997 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.541998 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c20540
2a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.553528 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b95d45578a8935128ba3e8834bef55a7ffce6465b6e61628550da58ad22e576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9dd4505d5db133081318645bc2b30a0440a5856d7cd583df53d97d83c2f28f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:05:09Z\\\",\\\"message\\\":\\\"e crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:09Z is after 
2025-08-24T17:21:41Z]\\\\nI1125 09:05:09.705848 6115 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:05:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b95d45578a8935128ba3e8834bef55a7ffce6465b6e61628550da58ad22e576\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"message\\\":\\\"ervices.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, 
Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1125 09:05:38.720221 6511 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1125 09:05:38.720715 6511 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1125 09:05:38.720719 6511 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1125 09:05:38.720723 6511 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1125 09:05:38.718903 6511 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d
678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.595738 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.595765 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.595774 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.595786 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.595795 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:39Z","lastTransitionTime":"2025-11-25T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.697833 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.697976 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.698051 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.698128 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.698192 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:39Z","lastTransitionTime":"2025-11-25T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.800111 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.800223 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.800284 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.800342 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.800403 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:39Z","lastTransitionTime":"2025-11-25T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.901740 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.901792 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.901801 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.901812 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:39 crc kubenswrapper[4565]: I1125 09:05:39.901819 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:39Z","lastTransitionTime":"2025-11-25T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.003336 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.003547 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.003572 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.003584 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.003654 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:40Z","lastTransitionTime":"2025-11-25T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.105974 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.106011 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.106031 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.106042 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.106051 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:40Z","lastTransitionTime":"2025-11-25T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.208101 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.208127 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.208135 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.208144 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.208151 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:40Z","lastTransitionTime":"2025-11-25T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.309558 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.309581 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.309589 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.309600 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.309608 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:40Z","lastTransitionTime":"2025-11-25T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.409793 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vk74d_23e95c48-8d61-4222-a968-b86203ef8aab/ovnkube-controller/3.log" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.410772 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.410794 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.410803 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.410813 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.410819 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:40Z","lastTransitionTime":"2025-11-25T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.412491 4565 scope.go:117] "RemoveContainer" containerID="7b95d45578a8935128ba3e8834bef55a7ffce6465b6e61628550da58ad22e576" Nov 25 09:05:40 crc kubenswrapper[4565]: E1125 09:05:40.412610 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.419601 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c205402a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.430986 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b95d45578a8935128ba3e8834bef55a7ffce6465b6e61628550da58ad22e576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b95d45578a8935128ba3e8834bef55a7ffce6465b6e61628550da58ad22e576\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"message\\\":\\\"ervices.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, 
EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1125 09:05:38.720221 6511 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1125 09:05:38.720715 6511 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1125 09:05:38.720719 6511 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1125 09:05:38.720723 6511 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1125 09:05:38.718903 6511 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:05:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e94
3b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.437175 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.443895 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be22f021-a051-4111-ba40-782e0c85f8b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114692b6925665d9a848950509e4a25a930fe69265b1db41a70f83a3bf10acdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0802c4e34d0a2da251d035aaf1249d86d594079eaf4d59a2939436fa9dd06d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dd8cl\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.450948 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fzpzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b047b2-31c7-45e7-a944-8d9c6de61061\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fzpzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:40 crc 
kubenswrapper[4565]: I1125 09:05:40.458842 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.466826 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.475418 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7e
cc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.483245 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91a29f2c-150e-44ca-ac6b-01c74197d120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5495757ee21b1874ef175f98308016e06007f6a55bc7b55376b82cf291878a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48dee57ce020adcee27aca2d8950c9cebead8c430113c4c151b08babac9299\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3129aed1548617b0b63fe023e6112d5be72db903bcc44a95a376cd3f42be0d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13c927d9561a2a8f119338d4774db6f33bdd828cafcd0778c7a569da5526f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f13c927d9561a2a8f119338d4774db6f33bdd828cafcd0778c7a569da5526f56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.492154 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.501181 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a957dd6a78e51bbac2e5e91939083721f8d1b1efcb75f447880a584b6af8c59e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:05:32Z\\\",\\\"message\\\":\\\"2025-11-25T09:04:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_aef70dfb-3bc9-425e-981d-272c2640c285\\\\n2025-11-25T09:04:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_aef70dfb-3bc9-425e-981d-272c2640c285 to /host/opt/cni/bin/\\\\n2025-11-25T09:04:47Z [verbose] multus-daemon started\\\\n2025-11-25T09:04:47Z [verbose] 
Readiness Indicator file check\\\\n2025-11-25T09:05:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.510873 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b4
54e10f5a27e7fcf592c6e895778990ee13d94756f5bd6a0ae252e1b00ade5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.512256 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.512347 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.512426 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.512510 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:40 crc kubenswrapper[4565]: 
I1125 09:05:40.512577 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:40Z","lastTransitionTime":"2025-11-25T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.518999 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.526362 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.534400 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.543184 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.550218 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.563064 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019be
e1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1
dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.614280 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.614341 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.614351 4565 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.614362 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.614370 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:40Z","lastTransitionTime":"2025-11-25T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.716237 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.716345 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.716403 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.716459 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.716508 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:40Z","lastTransitionTime":"2025-11-25T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.818554 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.818581 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.818590 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.818601 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.818609 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:40Z","lastTransitionTime":"2025-11-25T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.919886 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.919910 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.919918 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.919954 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:40 crc kubenswrapper[4565]: I1125 09:05:40.919963 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:40Z","lastTransitionTime":"2025-11-25T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.021738 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.021759 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.021767 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.021777 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.021785 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:41Z","lastTransitionTime":"2025-11-25T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.096393 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.096416 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:41 crc kubenswrapper[4565]: E1125 09:05:41.096468 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.096482 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.096398 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:41 crc kubenswrapper[4565]: E1125 09:05:41.096594 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:41 crc kubenswrapper[4565]: E1125 09:05:41.096621 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:41 crc kubenswrapper[4565]: E1125 09:05:41.096668 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.123029 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.123051 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.123060 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.123070 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.123077 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:41Z","lastTransitionTime":"2025-11-25T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.224403 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.224429 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.224438 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.224449 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.224459 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:41Z","lastTransitionTime":"2025-11-25T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.326143 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.326202 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.326214 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.326227 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.326236 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:41Z","lastTransitionTime":"2025-11-25T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.428207 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.428234 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.428243 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.428255 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.428280 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:41Z","lastTransitionTime":"2025-11-25T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.529551 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.529666 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.529739 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.529800 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.529853 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:41Z","lastTransitionTime":"2025-11-25T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.632320 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.632345 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.632354 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.632365 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.632375 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:41Z","lastTransitionTime":"2025-11-25T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.734100 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.734147 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.734158 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.734172 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.734181 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:41Z","lastTransitionTime":"2025-11-25T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.836153 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.836183 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.836192 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.836223 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.836231 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:41Z","lastTransitionTime":"2025-11-25T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.938333 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.938365 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.938377 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.938390 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:41 crc kubenswrapper[4565]: I1125 09:05:41.938399 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:41Z","lastTransitionTime":"2025-11-25T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.040701 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.040732 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.040742 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.040754 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.040762 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:42Z","lastTransitionTime":"2025-11-25T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.142836 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.142887 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.142898 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.142911 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.142920 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:42Z","lastTransitionTime":"2025-11-25T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.244722 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.244755 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.244774 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.244785 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.244793 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:42Z","lastTransitionTime":"2025-11-25T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.346114 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.346143 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.346159 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.346172 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.346180 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:42Z","lastTransitionTime":"2025-11-25T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.447732 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.447753 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.447760 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.447770 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.447777 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:42Z","lastTransitionTime":"2025-11-25T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.549490 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.549592 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.549670 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.549733 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.549786 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:42Z","lastTransitionTime":"2025-11-25T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.651556 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.651617 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.651628 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.651643 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.651654 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:42Z","lastTransitionTime":"2025-11-25T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.753067 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.753096 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.753106 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.753116 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.753125 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:42Z","lastTransitionTime":"2025-11-25T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.854392 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.854415 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.854423 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.854431 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.854440 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:42Z","lastTransitionTime":"2025-11-25T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.956341 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.956371 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.956379 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.956391 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:42 crc kubenswrapper[4565]: I1125 09:05:42.956404 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:42Z","lastTransitionTime":"2025-11-25T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.057755 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.057788 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.057797 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.057809 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.057818 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:43Z","lastTransitionTime":"2025-11-25T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.096334 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.096368 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.096332 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:43 crc kubenswrapper[4565]: E1125 09:05:43.096436 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.096488 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:43 crc kubenswrapper[4565]: E1125 09:05:43.096576 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:43 crc kubenswrapper[4565]: E1125 09:05:43.096741 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:43 crc kubenswrapper[4565]: E1125 09:05:43.096869 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.159345 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.159395 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.159405 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.159417 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.159425 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:43Z","lastTransitionTime":"2025-11-25T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.260917 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.260961 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.260970 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.260979 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.260987 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:43Z","lastTransitionTime":"2025-11-25T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.361811 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.361856 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.361865 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.361876 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.361883 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:43Z","lastTransitionTime":"2025-11-25T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.464040 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.464068 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.464078 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.464088 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.464095 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:43Z","lastTransitionTime":"2025-11-25T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.565407 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.565441 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.565450 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.565464 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.565474 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:43Z","lastTransitionTime":"2025-11-25T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.667334 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.667372 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.667384 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.667397 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.667406 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:43Z","lastTransitionTime":"2025-11-25T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.769340 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.769373 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.769385 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.769396 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.769404 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:43Z","lastTransitionTime":"2025-11-25T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.871311 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.871376 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.871386 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.871397 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.871404 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:43Z","lastTransitionTime":"2025-11-25T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.973048 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.973096 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.973107 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.973117 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:43 crc kubenswrapper[4565]: I1125 09:05:43.973126 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:43Z","lastTransitionTime":"2025-11-25T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.074717 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.074765 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.074776 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.074790 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.074799 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:44Z","lastTransitionTime":"2025-11-25T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.176451 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.176498 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.176507 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.176521 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.176529 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:44Z","lastTransitionTime":"2025-11-25T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.278386 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.278493 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.278569 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.278632 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.278694 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:44Z","lastTransitionTime":"2025-11-25T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.380699 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.380790 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.380853 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.380918 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.380998 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:44Z","lastTransitionTime":"2025-11-25T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.482581 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.482708 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.482764 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.482820 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.482875 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:44Z","lastTransitionTime":"2025-11-25T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.584247 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.584306 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.584316 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.584327 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.584337 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:44Z","lastTransitionTime":"2025-11-25T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.685832 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.685862 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.685871 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.685881 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.685904 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:44Z","lastTransitionTime":"2025-11-25T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.787405 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.787522 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.787586 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.787639 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.787688 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:44Z","lastTransitionTime":"2025-11-25T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.888786 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.888808 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.888835 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.888845 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.888853 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:44Z","lastTransitionTime":"2025-11-25T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.990701 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.990726 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.990734 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.990744 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:44 crc kubenswrapper[4565]: I1125 09:05:44.990751 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:44Z","lastTransitionTime":"2025-11-25T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.091856 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.091909 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.091920 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.091948 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.091958 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:45Z","lastTransitionTime":"2025-11-25T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.097154 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.097193 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:45 crc kubenswrapper[4565]: E1125 09:05:45.097266 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.097289 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.097158 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:45 crc kubenswrapper[4565]: E1125 09:05:45.097358 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:45 crc kubenswrapper[4565]: E1125 09:05:45.097420 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:45 crc kubenswrapper[4565]: E1125 09:05:45.097497 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.193756 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.193784 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.193792 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.193802 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.193811 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:45Z","lastTransitionTime":"2025-11-25T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.295791 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.295816 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.295824 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.295833 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.295840 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:45Z","lastTransitionTime":"2025-11-25T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.397279 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.397309 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.397317 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.397330 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.397338 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:45Z","lastTransitionTime":"2025-11-25T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.498548 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.498582 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.498590 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.498603 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.498612 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:45Z","lastTransitionTime":"2025-11-25T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.601112 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.601141 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.601167 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.601179 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.601187 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:45Z","lastTransitionTime":"2025-11-25T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.702799 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.702862 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.702873 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.702886 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.702896 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:45Z","lastTransitionTime":"2025-11-25T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.804802 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.804826 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.804835 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.804845 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.804851 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:45Z","lastTransitionTime":"2025-11-25T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.907089 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.907118 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.907128 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.907139 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:45 crc kubenswrapper[4565]: I1125 09:05:45.907147 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:45Z","lastTransitionTime":"2025-11-25T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.009160 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.009189 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.009197 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.009209 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.009217 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:46Z","lastTransitionTime":"2025-11-25T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.110587 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.110700 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.110866 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.110952 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.111027 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:46Z","lastTransitionTime":"2025-11-25T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.212134 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.212166 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.212176 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.212186 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.212193 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:46Z","lastTransitionTime":"2025-11-25T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.313403 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.313430 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.313438 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.313449 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.313457 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:46Z","lastTransitionTime":"2025-11-25T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.414648 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.414672 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.414701 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.414713 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.414721 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:46Z","lastTransitionTime":"2025-11-25T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.515867 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.515897 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.515907 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.515920 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.515947 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:46Z","lastTransitionTime":"2025-11-25T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.617423 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.617457 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.617488 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.617501 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.617511 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:46Z","lastTransitionTime":"2025-11-25T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.719373 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.719562 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.719630 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.719698 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.719764 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:46Z","lastTransitionTime":"2025-11-25T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.824682 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.824714 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.824725 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.824738 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.824746 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:46Z","lastTransitionTime":"2025-11-25T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.926540 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.926601 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.926609 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.926622 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:46 crc kubenswrapper[4565]: I1125 09:05:46.926630 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:46Z","lastTransitionTime":"2025-11-25T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.028520 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.028554 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.028563 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.028576 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.028586 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:47Z","lastTransitionTime":"2025-11-25T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.096320 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:47 crc kubenswrapper[4565]: E1125 09:05:47.096409 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.096443 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.096479 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.096546 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:47 crc kubenswrapper[4565]: E1125 09:05:47.096633 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:47 crc kubenswrapper[4565]: E1125 09:05:47.096768 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:47 crc kubenswrapper[4565]: E1125 09:05:47.096881 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.105340 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91a29f2c-150e-44ca-ac6b-01c74197d120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5495757ee21b1874ef175f98308016e06007f6a55bc7b55376b82cf291878a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48dee57ce020adcee27aca2d8950c9cebead8c430113c4c151b08babac9299\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3129aed1548617b0b63fe023e6112d5be72db903bcc44a95a376cd3f42be0d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13c927d9561a2a8f119338d4774db6f33bdd828cafcd0778c7a569da5526f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f13c927d9561a2a8f119338d4774db6f33bdd828cafcd0778c7a569da5526f56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.113211 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.122148 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jpfp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d96c20a-2514-47cf-99ec-a314bacac513\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a957dd6a78e51bbac2e5e91939083721f8d1b1efcb75f447880a584b6af8c59e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:05:32Z\\\",\\\"message\\\":\\\"2025-11-25T09:04:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_aef70dfb-3bc9-425e-981d-272c2640c285\\\\n2025-11-25T09:04:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_aef70dfb-3bc9-425e-981d-272c2640c285 to /host/opt/cni/bin/\\\\n2025-11-25T09:04:47Z [verbose] multus-daemon started\\\\n2025-11-25T09:04:47Z [verbose] 
Readiness Indicator file check\\\\n2025-11-25T09:05:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jpfp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.130723 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.130744 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.130753 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.130764 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.130773 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:47Z","lastTransitionTime":"2025-11-25T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.132447 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28292a27-3521-4953-af83-48804d5ed947\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b454e10f5a27e7fcf592c6e895778990ee13d94756f5bd6a0ae252e1b00ade5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://018af64b3cd9ac85c6cac096fefa2c1ab769fa572cdbde6fcadab7e58a546d9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://4f412eeddb63685c8564e13bd4d763b1da0d15cfdacf5a50de3c4149c7a7a253\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950a5feff5d7bff766f44081470d0d0d4757c4f66be44640925cbb6322f8485c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0e3d42ba0117e56278e9db2bdb536a7efa68640f7062469f01a5c75230a0de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94317cd3e3383c2526fcbccca973110cf424327991d3a7447bc41f0bca1eb975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49acd95a632bb6909bcbe034a7e2b4b0e9d5ea2607b023562a11edf006adfc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9j9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pmkqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.144825 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bca5e724-f24a-444e-836e-63ef46b0b9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3abfc1130e18f79bbc5585e65e7c60f597802b9e7051a368725290078c4abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca8056666f4ff800f22b729469ae0bc0548d3253591863c0393bd0e70a62c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd2ca3c7fd66612d96029c56cec6c43094cc7b6b4526fb766c31a29bf11cb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f769bda362c16caa082fbc26059027bb5c007b3e53e3b3f053ba10988ae51a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3680c32beaa2af0c1eb8f37e5fa1621e6b68b0d4b04384059d24dca3e267bfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a0b8b23f80dc582282444c7a62aa4e27f549f51a716a78aa2cb7dcc6993b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://debe7917b4beac9ca7b1dedd0a74333a49b34819282f8b8074359276d041cff5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1dd8d081a89a92f9dcf36ac7a75bccc2caea1d76148f5c9db3ccb1572ff6a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.155633 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.162980 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c813dc80baa3d53b41cd9da07b11f981e5cc3aa10a61324b7f5843ba462e2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T09:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.170263 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.178651 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88d9654adf8c109875b59c28187b064b2371f9d1d88ce6946b60193c55c8905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.185548 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cvvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb18f9-1819-4221-9486-4d042cd042d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ba71343a647a20f069f43bf56df3e21b25ab85771b597563e13ad2b2b1d2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pn7n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cvvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.193310 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37ddeba2-a699-4fa8-8d60-1833dcea3ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef4836810fde68bdc0c483c4e4234a31bbdc3ed7466b99eb124eb490a6249f4\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c2803110205ee1ed4ddeb9af787f0e97bbe3d590837408e477f22d08a5903dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-
syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1655f8c06d4e759cd441658fff7989665a51f9a5135e972d5fcd227af7b8f6a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.200794 4565 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bad26f-53b0-48f7-9ac4-110d3d8a475d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a2ec66a4f9b218daa8c6f49e0bc806d5c23fd863489e559d009df956abac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c205402a2133c7131e341db10af708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q97tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-r28bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.212292 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e95c48-8d61-4222-a968-b86203ef8aab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b95d45578a8935128ba3e8834bef55a7ffce6465b6e61628550da58ad22e576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b95d45578a8935128ba3e8834bef55a7ffce6465b6e61628550da58ad22e576\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T09:05:38Z\\\",\\\"message\\\":\\\"ervices.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, 
EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1125 09:05:38.720221 6511 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1125 09:05:38.720715 6511 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1125 09:05:38.720719 6511 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1125 09:05:38.720723 6511 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1125 09:05:38.718903 6511 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:05:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babab72e99050a0e94
3b339c8337ea7fb1aaafb9cdad1778a849d69da731d678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vk74d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.218351 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpgqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03764f22-c722-4de2-986b-9236cd9ef0af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc580c4651fa735ba8c61ce2b440a167be9f2a2cc29c8cbce372755066e377f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hj72n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpgqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.225719 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be22f021-a051-4111-ba40-782e0c85f8b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114692b6925665d9a848950509e4a25a930fe69265b1db41a70f83a3bf10acdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0802c4e34d0a2da251d035aaf1249d86d594079eaf4d59a2939436fa9dd06d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8pj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dd8cl\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.232043 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.232080 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.232091 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.232106 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.232116 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:47Z","lastTransitionTime":"2025-11-25T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.232457 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fzpzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b047b2-31c7-45e7-a944-8d9c6de61061\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4f64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fzpzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:47 crc 
kubenswrapper[4565]: I1125 09:05:47.240739 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93d20159-72d2-4207-9884-03b4ea42de14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cdf1997acb00
b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T09:04:44Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 09:04:44.173870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 09:04:44.174504 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 09:04:44.176025 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-698469396/tls.crt::/tmp/serving-cert-698469396/tls.key\\\\\\\"\\\\nI1125 09:04:44.375369 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 09:04:44.378870 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 09:04:44.378917 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 09:04:44.378973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 09:04:44.378997 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 09:04:44.382585 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 09:04:44.382637 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 09:04:44.382659 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382663 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 09:04:44.382666 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 09:04:44.382669 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 09:04:44.382672 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 09:04:44.382675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 09:04:44.383210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T09:04:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.248278 4565 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T09:04:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c48c0c7ce8a2df02cae32e8edd61541ed0b632bb19363b9e3ae3e6ab139f8d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c523477c6500d813c2a84752f40f3ee9c63b2d6f95c436c30470f27963be4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T09:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.334250 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.334279 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.334288 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.334300 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.334308 4565 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:47Z","lastTransitionTime":"2025-11-25T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.436322 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.436357 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.436368 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.436380 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.436387 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:47Z","lastTransitionTime":"2025-11-25T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.538643 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.538672 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.538681 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.538693 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.538701 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:47Z","lastTransitionTime":"2025-11-25T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.640693 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.640715 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.640723 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.640732 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.640739 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:47Z","lastTransitionTime":"2025-11-25T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.742813 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.742841 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.742851 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.742877 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.742885 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:47Z","lastTransitionTime":"2025-11-25T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.844719 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.844750 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.844758 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.844767 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.844775 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:47Z","lastTransitionTime":"2025-11-25T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.946575 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.946607 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.946618 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.946632 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:47 crc kubenswrapper[4565]: I1125 09:05:47.946640 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:47Z","lastTransitionTime":"2025-11-25T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.048436 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.048467 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.048475 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.048486 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.048493 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:48Z","lastTransitionTime":"2025-11-25T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.150461 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.150487 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.150496 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.150507 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.150515 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:48Z","lastTransitionTime":"2025-11-25T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.252258 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.252290 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.252321 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.252331 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.252338 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:48Z","lastTransitionTime":"2025-11-25T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.354158 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.354189 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.354199 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.354211 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.354221 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:48Z","lastTransitionTime":"2025-11-25T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.455392 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.455423 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.455433 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.455471 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.455480 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:48Z","lastTransitionTime":"2025-11-25T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.557390 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.557415 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.557424 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.557435 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.557464 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:48Z","lastTransitionTime":"2025-11-25T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.659577 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.659600 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.659607 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.659617 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.659624 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:48Z","lastTransitionTime":"2025-11-25T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.760712 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.760748 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.760757 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.760772 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.760780 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:48Z","lastTransitionTime":"2025-11-25T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.862469 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.862497 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.862506 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.862515 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.862524 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:48Z","lastTransitionTime":"2025-11-25T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.934054 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:05:48 crc kubenswrapper[4565]: E1125 09:05:48.934160 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 09:06:52.934138297 +0000 UTC m=+146.136633445 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.963983 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.964026 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.964037 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.964048 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:48 crc kubenswrapper[4565]: I1125 09:05:48.964057 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:48Z","lastTransitionTime":"2025-11-25T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.034741 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.034774 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.034804 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.034822 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:49 crc kubenswrapper[4565]: E1125 09:05:49.034827 4565 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Nov 25 09:05:49 crc kubenswrapper[4565]: E1125 09:05:49.034871 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 09:06:53.034858976 +0000 UTC m=+146.237354114 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 09:05:49 crc kubenswrapper[4565]: E1125 09:05:49.034899 4565 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 09:05:49 crc kubenswrapper[4565]: E1125 09:05:49.034943 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 09:05:49 crc kubenswrapper[4565]: E1125 09:05:49.034961 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 09:05:49 crc kubenswrapper[4565]: E1125 09:05:49.034970 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 09:05:49 crc kubenswrapper[4565]: E1125 09:05:49.034983 4565 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 09:05:49 crc 
kubenswrapper[4565]: E1125 09:05:49.034990 4565 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:05:49 crc kubenswrapper[4565]: E1125 09:05:49.034971 4565 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:05:49 crc kubenswrapper[4565]: E1125 09:05:49.034946 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 09:06:53.034922786 +0000 UTC m=+146.237417924 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 09:05:49 crc kubenswrapper[4565]: E1125 09:05:49.035048 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 09:06:53.035041669 +0000 UTC m=+146.237536808 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:05:49 crc kubenswrapper[4565]: E1125 09:05:49.035058 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 09:06:53.035053322 +0000 UTC m=+146.237548460 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.065805 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.065831 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.065839 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.065849 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.065858 4565 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:49Z","lastTransitionTime":"2025-11-25T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.096140 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.096171 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.096192 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:49 crc kubenswrapper[4565]: E1125 09:05:49.096291 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.096323 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:49 crc kubenswrapper[4565]: E1125 09:05:49.096417 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:49 crc kubenswrapper[4565]: E1125 09:05:49.096495 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:49 crc kubenswrapper[4565]: E1125 09:05:49.096542 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.167440 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.167465 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.167474 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.167484 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.167492 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:49Z","lastTransitionTime":"2025-11-25T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.187513 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.187545 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.187560 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.187573 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.187581 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:49Z","lastTransitionTime":"2025-11-25T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:49 crc kubenswrapper[4565]: E1125 09:05:49.195827 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.198014 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.198036 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.198045 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.198055 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.198062 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:49Z","lastTransitionTime":"2025-11-25T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:49 crc kubenswrapper[4565]: E1125 09:05:49.205116 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.206892 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.206914 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.206921 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.206950 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.206958 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:49Z","lastTransitionTime":"2025-11-25T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:49 crc kubenswrapper[4565]: E1125 09:05:49.213949 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.215766 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.215859 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.216060 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.216173 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.216226 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:49Z","lastTransitionTime":"2025-11-25T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:49 crc kubenswrapper[4565]: E1125 09:05:49.223316 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.225129 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.225152 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.225160 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.225170 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.225177 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:49Z","lastTransitionTime":"2025-11-25T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:49 crc kubenswrapper[4565]: E1125 09:05:49.232718 4565 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T09:05:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d91d380a-1f82-4c23-9139-1b88f9b7dd73\\\",\\\"systemUUID\\\":\\\"717cb293-950d-4b28-956b-07370f319336\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T09:05:49Z is after 2025-08-24T17:21:41Z" Nov 25 09:05:49 crc kubenswrapper[4565]: E1125 09:05:49.232826 4565 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.269023 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.269050 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.269059 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.269071 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.269080 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:49Z","lastTransitionTime":"2025-11-25T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.371072 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.371100 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.371108 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.371120 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.371129 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:49Z","lastTransitionTime":"2025-11-25T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.472714 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.472750 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.472758 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.472772 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.472781 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:49Z","lastTransitionTime":"2025-11-25T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.574654 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.574685 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.574694 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.574706 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.574719 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:49Z","lastTransitionTime":"2025-11-25T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.676287 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.676314 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.676323 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.676336 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.676344 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:49Z","lastTransitionTime":"2025-11-25T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.778422 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.778441 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.778449 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.778459 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.778466 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:49Z","lastTransitionTime":"2025-11-25T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.880359 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.880381 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.880390 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.880404 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.880413 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:49Z","lastTransitionTime":"2025-11-25T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.981732 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.981766 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.981775 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.981789 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:49 crc kubenswrapper[4565]: I1125 09:05:49.981798 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:49Z","lastTransitionTime":"2025-11-25T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.083438 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.083465 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.083474 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.083485 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.083492 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:50Z","lastTransitionTime":"2025-11-25T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.185174 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.185209 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.185218 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.185232 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.185240 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:50Z","lastTransitionTime":"2025-11-25T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.287207 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.287238 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.287246 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.287257 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.287266 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:50Z","lastTransitionTime":"2025-11-25T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.388438 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.388473 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.388481 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.388494 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.388503 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:50Z","lastTransitionTime":"2025-11-25T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.490568 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.490629 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.490638 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.490651 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.490659 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:50Z","lastTransitionTime":"2025-11-25T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.592563 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.592619 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.592631 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.592644 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.592654 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:50Z","lastTransitionTime":"2025-11-25T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.694194 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.694227 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.694236 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.694248 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.694257 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:50Z","lastTransitionTime":"2025-11-25T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.795811 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.795858 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.795867 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.795881 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.795888 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:50Z","lastTransitionTime":"2025-11-25T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.897514 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.897535 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.897543 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.897553 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.897561 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:50Z","lastTransitionTime":"2025-11-25T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.999429 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.999465 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.999476 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.999489 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:50 crc kubenswrapper[4565]: I1125 09:05:50.999500 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:50Z","lastTransitionTime":"2025-11-25T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.097323 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:51 crc kubenswrapper[4565]: E1125 09:05:51.097411 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.097477 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.097751 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:51 crc kubenswrapper[4565]: E1125 09:05:51.097909 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.098112 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:51 crc kubenswrapper[4565]: E1125 09:05:51.098258 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:51 crc kubenswrapper[4565]: E1125 09:05:51.098462 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.101181 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.102041 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.102050 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.102060 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.102068 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:51Z","lastTransitionTime":"2025-11-25T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.203699 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.203731 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.203760 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.203774 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.203783 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:51Z","lastTransitionTime":"2025-11-25T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.305546 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.305571 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.305579 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.305588 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.305635 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:51Z","lastTransitionTime":"2025-11-25T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.407676 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.407694 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.407702 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.407710 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.407718 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:51Z","lastTransitionTime":"2025-11-25T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.510077 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.510116 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.510124 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.510136 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.510144 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:51Z","lastTransitionTime":"2025-11-25T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.611908 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.611951 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.611960 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.611972 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.611979 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:51Z","lastTransitionTime":"2025-11-25T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.713224 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.713250 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.713273 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.713284 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.713293 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:51Z","lastTransitionTime":"2025-11-25T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.814942 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.814973 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.814983 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.815004 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.815014 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:51Z","lastTransitionTime":"2025-11-25T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.917778 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.917817 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.917825 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.917839 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:51 crc kubenswrapper[4565]: I1125 09:05:51.917853 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:51Z","lastTransitionTime":"2025-11-25T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.019216 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.019244 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.019251 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.019263 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.019272 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:52Z","lastTransitionTime":"2025-11-25T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.103011 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.168550 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.168592 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.168603 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.168616 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.168625 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:52Z","lastTransitionTime":"2025-11-25T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.270320 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.270350 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.270359 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.270371 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.270380 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:52Z","lastTransitionTime":"2025-11-25T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.371962 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.372024 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.372038 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.372057 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.372069 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:52Z","lastTransitionTime":"2025-11-25T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.490489 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.490512 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.490521 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.490532 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.490539 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:52Z","lastTransitionTime":"2025-11-25T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.592031 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.592056 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.592065 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.592075 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.592083 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:52Z","lastTransitionTime":"2025-11-25T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.693678 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.693703 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.693711 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.693721 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.693729 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:52Z","lastTransitionTime":"2025-11-25T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.795391 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.795417 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.795425 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.795435 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.795442 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:52Z","lastTransitionTime":"2025-11-25T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.896834 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.896857 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.896866 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.896875 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.896883 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:52Z","lastTransitionTime":"2025-11-25T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.998808 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.998835 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.998844 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.998853 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:52 crc kubenswrapper[4565]: I1125 09:05:52.998860 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:52Z","lastTransitionTime":"2025-11-25T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.097060 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.097098 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.097060 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.097060 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:53 crc kubenswrapper[4565]: E1125 09:05:53.097148 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:53 crc kubenswrapper[4565]: E1125 09:05:53.097299 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:53 crc kubenswrapper[4565]: E1125 09:05:53.097349 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:53 crc kubenswrapper[4565]: E1125 09:05:53.097407 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.099937 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.099958 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.099966 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.099975 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.099992 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:53Z","lastTransitionTime":"2025-11-25T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.201644 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.201666 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.201674 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.201684 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.201691 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:53Z","lastTransitionTime":"2025-11-25T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.303201 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.303230 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.303238 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.303247 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.303255 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:53Z","lastTransitionTime":"2025-11-25T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.404749 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.404797 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.404805 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.404814 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.404822 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:53Z","lastTransitionTime":"2025-11-25T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.506572 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.506597 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.506604 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.506614 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.506621 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:53Z","lastTransitionTime":"2025-11-25T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.607457 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.607482 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.607491 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.607502 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.607510 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:53Z","lastTransitionTime":"2025-11-25T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.708960 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.708995 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.709003 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.709011 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.709018 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:53Z","lastTransitionTime":"2025-11-25T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.809944 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.809966 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.809975 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.809992 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.809999 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:53Z","lastTransitionTime":"2025-11-25T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.911567 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.911600 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.911608 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.911617 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:53 crc kubenswrapper[4565]: I1125 09:05:53.911623 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:53Z","lastTransitionTime":"2025-11-25T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.012481 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.012517 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.012526 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.012536 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.012543 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:54Z","lastTransitionTime":"2025-11-25T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.114325 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.114380 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.114389 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.114398 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.114406 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:54Z","lastTransitionTime":"2025-11-25T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.216329 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.216349 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.216357 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.216368 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.216375 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:54Z","lastTransitionTime":"2025-11-25T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.317854 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.317879 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.317887 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.317897 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.317903 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:54Z","lastTransitionTime":"2025-11-25T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.419627 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.419657 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.419665 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.419675 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.419682 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:54Z","lastTransitionTime":"2025-11-25T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.521383 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.521406 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.521414 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.521424 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.521431 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:54Z","lastTransitionTime":"2025-11-25T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.623451 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.623599 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.623671 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.623732 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.623792 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:54Z","lastTransitionTime":"2025-11-25T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.725036 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.725059 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.725068 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.725076 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.725083 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:54Z","lastTransitionTime":"2025-11-25T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.826457 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.826485 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.826495 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.826505 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.826513 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:54Z","lastTransitionTime":"2025-11-25T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.928062 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.928087 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.928096 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.928107 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:54 crc kubenswrapper[4565]: I1125 09:05:54.928115 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:54Z","lastTransitionTime":"2025-11-25T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.029597 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.029617 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.029625 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.029634 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.029640 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:55Z","lastTransitionTime":"2025-11-25T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.096259 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.096290 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:55 crc kubenswrapper[4565]: E1125 09:05:55.096346 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.096372 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.096411 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:55 crc kubenswrapper[4565]: E1125 09:05:55.096521 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:55 crc kubenswrapper[4565]: E1125 09:05:55.096564 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:55 crc kubenswrapper[4565]: E1125 09:05:55.096630 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.097050 4565 scope.go:117] "RemoveContainer" containerID="7b95d45578a8935128ba3e8834bef55a7ffce6465b6e61628550da58ad22e576" Nov 25 09:05:55 crc kubenswrapper[4565]: E1125 09:05:55.097164 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.130909 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.130966 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.130975 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.130994 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.131002 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:55Z","lastTransitionTime":"2025-11-25T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.232287 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.232308 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.232315 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.232324 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.232332 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:55Z","lastTransitionTime":"2025-11-25T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.333797 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.333817 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.333825 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.333834 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.333840 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:55Z","lastTransitionTime":"2025-11-25T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.434896 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.434916 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.434923 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.434947 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.434954 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:55Z","lastTransitionTime":"2025-11-25T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.536804 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.536841 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.536854 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.536876 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.536887 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:55Z","lastTransitionTime":"2025-11-25T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.638656 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.638675 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.638683 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.638691 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.638698 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:55Z","lastTransitionTime":"2025-11-25T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.739965 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.739997 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.740005 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.740013 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.740020 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:55Z","lastTransitionTime":"2025-11-25T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.841805 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.841826 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.841834 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.841842 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.841849 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:55Z","lastTransitionTime":"2025-11-25T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.943232 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.943258 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.943267 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.943276 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:55 crc kubenswrapper[4565]: I1125 09:05:55.943283 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:55Z","lastTransitionTime":"2025-11-25T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.044954 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.044988 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.044997 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.045006 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.045013 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:56Z","lastTransitionTime":"2025-11-25T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.146505 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.146530 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.146540 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.146550 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.146557 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:56Z","lastTransitionTime":"2025-11-25T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.247636 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.247656 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.247664 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.247672 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.247679 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:56Z","lastTransitionTime":"2025-11-25T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.349594 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.349627 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.349636 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.349651 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.349659 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:56Z","lastTransitionTime":"2025-11-25T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.451686 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.451721 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.451730 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.451741 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.451748 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:56Z","lastTransitionTime":"2025-11-25T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.553696 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.553726 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.553734 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.553745 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.553755 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:56Z","lastTransitionTime":"2025-11-25T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.655916 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.655963 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.655983 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.655995 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.656003 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:56Z","lastTransitionTime":"2025-11-25T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.757452 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.757485 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.757493 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.757506 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.757519 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:56Z","lastTransitionTime":"2025-11-25T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.859203 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.859391 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.859451 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.859515 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.859571 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:56Z","lastTransitionTime":"2025-11-25T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.960641 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.960661 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.960668 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.960678 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:56 crc kubenswrapper[4565]: I1125 09:05:56.960685 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:56Z","lastTransitionTime":"2025-11-25T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.062258 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.062284 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.062292 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.062302 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.062309 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:57Z","lastTransitionTime":"2025-11-25T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.096466 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.096502 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:57 crc kubenswrapper[4565]: E1125 09:05:57.096559 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.096583 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:57 crc kubenswrapper[4565]: E1125 09:05:57.096642 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:57 crc kubenswrapper[4565]: E1125 09:05:57.096681 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.096751 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:57 crc kubenswrapper[4565]: E1125 09:05:57.096858 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.120005 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-pmkqf" podStartSLOduration=72.119992042 podStartE2EDuration="1m12.119992042s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:05:57.113056661 +0000 UTC m=+90.315551799" watchObservedRunningTime="2025-11-25 09:05:57.119992042 +0000 UTC m=+90.322487171" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.120151 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=5.120147796 podStartE2EDuration="5.120147796s" podCreationTimestamp="2025-11-25 09:05:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:05:57.118701234 +0000 UTC m=+90.321196373" watchObservedRunningTime="2025-11-25 09:05:57.120147796 +0000 UTC m=+90.322642934" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.133494 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=37.133483614 podStartE2EDuration="37.133483614s" podCreationTimestamp="2025-11-25 09:05:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:05:57.12637748 +0000 UTC m=+90.328872618" watchObservedRunningTime="2025-11-25 09:05:57.133483614 +0000 UTC m=+90.335978752" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.142505 4565 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/multus-jpfp5" podStartSLOduration=72.142491165 podStartE2EDuration="1m12.142491165s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:05:57.142320514 +0000 UTC m=+90.344815651" watchObservedRunningTime="2025-11-25 09:05:57.142491165 +0000 UTC m=+90.344986303" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.164365 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.164553 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.164618 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.164384 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-cvvlr" podStartSLOduration=73.164373105 podStartE2EDuration="1m13.164373105s" podCreationTimestamp="2025-11-25 09:04:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:05:57.164204107 +0000 UTC m=+90.366699246" watchObservedRunningTime="2025-11-25 09:05:57.164373105 +0000 UTC m=+90.366868244" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.164686 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.164848 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:57Z","lastTransitionTime":"2025-11-25T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.219823 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=68.21980837 podStartE2EDuration="1m8.21980837s" podCreationTimestamp="2025-11-25 09:04:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:05:57.211061669 +0000 UTC m=+90.413556807" watchObservedRunningTime="2025-11-25 09:05:57.21980837 +0000 UTC m=+90.422303508" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.243741 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dd8cl" podStartSLOduration=72.243726894 podStartE2EDuration="1m12.243726894s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:05:57.243513421 +0000 UTC m=+90.446008559" watchObservedRunningTime="2025-11-25 09:05:57.243726894 +0000 UTC m=+90.446222032" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.259469 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=72.259455233 podStartE2EDuration="1m12.259455233s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:05:57.259282277 +0000 UTC m=+90.461777416" watchObservedRunningTime="2025-11-25 09:05:57.259455233 +0000 UTC m=+90.461950370" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.267247 4565 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.267273 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.267282 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.267292 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.267300 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:57Z","lastTransitionTime":"2025-11-25T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.284226 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podStartSLOduration=72.284218055 podStartE2EDuration="1m12.284218055s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:05:57.269601698 +0000 UTC m=+90.472096837" watchObservedRunningTime="2025-11-25 09:05:57.284218055 +0000 UTC m=+90.486713193" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.300562 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=72.300552596 podStartE2EDuration="1m12.300552596s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:05:57.300515956 +0000 UTC m=+90.503011095" watchObservedRunningTime="2025-11-25 09:05:57.300552596 +0000 UTC m=+90.503047734" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.301092 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-dpgqk" podStartSLOduration=73.301085768 podStartE2EDuration="1m13.301085768s" podCreationTimestamp="2025-11-25 09:04:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:05:57.290905351 +0000 UTC m=+90.493400489" watchObservedRunningTime="2025-11-25 09:05:57.301085768 +0000 UTC m=+90.503580907" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.369186 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 
09:05:57.369215 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.369224 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.369235 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.369245 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:57Z","lastTransitionTime":"2025-11-25T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.471139 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.471173 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.471182 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.471194 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.471203 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:57Z","lastTransitionTime":"2025-11-25T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.573292 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.573320 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.573328 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.573337 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.573344 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:57Z","lastTransitionTime":"2025-11-25T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.674569 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.674598 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.674606 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.674617 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.674625 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:57Z","lastTransitionTime":"2025-11-25T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.776485 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.776512 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.776520 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.776530 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.776537 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:57Z","lastTransitionTime":"2025-11-25T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.878205 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.878230 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.878238 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.878249 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.878257 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:57Z","lastTransitionTime":"2025-11-25T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.979706 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.979837 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.979908 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.980013 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:57 crc kubenswrapper[4565]: I1125 09:05:57.980068 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:57Z","lastTransitionTime":"2025-11-25T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.081741 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.081875 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.081957 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.082046 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.082115 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:58Z","lastTransitionTime":"2025-11-25T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.183772 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.183821 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.183830 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.183842 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.183850 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:58Z","lastTransitionTime":"2025-11-25T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.285493 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.285525 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.285533 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.285545 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.285553 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:58Z","lastTransitionTime":"2025-11-25T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.387611 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.387636 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.387644 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.387655 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.387662 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:58Z","lastTransitionTime":"2025-11-25T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.489010 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.489031 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.489038 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.489047 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.489058 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:58Z","lastTransitionTime":"2025-11-25T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.590550 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.590583 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.590593 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.590606 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.590615 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:58Z","lastTransitionTime":"2025-11-25T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.691959 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.692000 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.692010 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.692022 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.692029 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:58Z","lastTransitionTime":"2025-11-25T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.794029 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.794085 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.794095 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.794106 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.794113 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:58Z","lastTransitionTime":"2025-11-25T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.896209 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.896256 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.896265 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.896276 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.896286 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:58Z","lastTransitionTime":"2025-11-25T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.998619 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.998655 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.998663 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.998675 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:58 crc kubenswrapper[4565]: I1125 09:05:58.998683 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:58Z","lastTransitionTime":"2025-11-25T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.096601 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.096788 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.096904 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:05:59 crc kubenswrapper[4565]: E1125 09:05:59.096899 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.096943 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:05:59 crc kubenswrapper[4565]: E1125 09:05:59.097045 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:05:59 crc kubenswrapper[4565]: E1125 09:05:59.097109 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:05:59 crc kubenswrapper[4565]: E1125 09:05:59.097173 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.100178 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.100217 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.100226 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.100238 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.100247 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:59Z","lastTransitionTime":"2025-11-25T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.201639 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.201677 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.201687 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.201702 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.201711 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:59Z","lastTransitionTime":"2025-11-25T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.302896 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.302951 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.302964 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.302987 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.302995 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:59Z","lastTransitionTime":"2025-11-25T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.404703 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.404737 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.404747 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.404758 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.404767 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:59Z","lastTransitionTime":"2025-11-25T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.505889 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.505942 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.505956 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.505977 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.505986 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:59Z","lastTransitionTime":"2025-11-25T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.590108 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.590140 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.590148 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.590177 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.590190 4565 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T09:05:59Z","lastTransitionTime":"2025-11-25T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.619496 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-4clc8"] Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.619830 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4clc8" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.621029 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.621336 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.621527 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.622162 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.743451 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11e49ec1-5731-46a7-b9fc-96d552122a36-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4clc8\" (UID: \"11e49ec1-5731-46a7-b9fc-96d552122a36\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4clc8" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.743486 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/11e49ec1-5731-46a7-b9fc-96d552122a36-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4clc8\" (UID: \"11e49ec1-5731-46a7-b9fc-96d552122a36\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4clc8" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.743502 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/11e49ec1-5731-46a7-b9fc-96d552122a36-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4clc8\" (UID: \"11e49ec1-5731-46a7-b9fc-96d552122a36\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4clc8" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.743525 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/11e49ec1-5731-46a7-b9fc-96d552122a36-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4clc8\" (UID: \"11e49ec1-5731-46a7-b9fc-96d552122a36\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4clc8" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.743550 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/11e49ec1-5731-46a7-b9fc-96d552122a36-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4clc8\" (UID: \"11e49ec1-5731-46a7-b9fc-96d552122a36\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4clc8" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.844125 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/11e49ec1-5731-46a7-b9fc-96d552122a36-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4clc8\" (UID: \"11e49ec1-5731-46a7-b9fc-96d552122a36\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4clc8" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.844175 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11e49ec1-5731-46a7-b9fc-96d552122a36-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4clc8\" (UID: \"11e49ec1-5731-46a7-b9fc-96d552122a36\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4clc8" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.844210 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/11e49ec1-5731-46a7-b9fc-96d552122a36-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4clc8\" (UID: \"11e49ec1-5731-46a7-b9fc-96d552122a36\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4clc8" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.844234 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/11e49ec1-5731-46a7-b9fc-96d552122a36-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4clc8\" (UID: \"11e49ec1-5731-46a7-b9fc-96d552122a36\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4clc8" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.844252 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/11e49ec1-5731-46a7-b9fc-96d552122a36-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4clc8\" (UID: \"11e49ec1-5731-46a7-b9fc-96d552122a36\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4clc8" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.844279 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/11e49ec1-5731-46a7-b9fc-96d552122a36-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4clc8\" (UID: \"11e49ec1-5731-46a7-b9fc-96d552122a36\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4clc8" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.844328 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/11e49ec1-5731-46a7-b9fc-96d552122a36-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4clc8\" (UID: \"11e49ec1-5731-46a7-b9fc-96d552122a36\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4clc8" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.845066 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/11e49ec1-5731-46a7-b9fc-96d552122a36-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4clc8\" (UID: \"11e49ec1-5731-46a7-b9fc-96d552122a36\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4clc8" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.848370 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11e49ec1-5731-46a7-b9fc-96d552122a36-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4clc8\" (UID: \"11e49ec1-5731-46a7-b9fc-96d552122a36\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4clc8" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.856757 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/11e49ec1-5731-46a7-b9fc-96d552122a36-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4clc8\" (UID: \"11e49ec1-5731-46a7-b9fc-96d552122a36\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4clc8" Nov 25 09:05:59 crc kubenswrapper[4565]: I1125 09:05:59.930731 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4clc8" Nov 25 09:06:00 crc kubenswrapper[4565]: I1125 09:06:00.452280 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4clc8" event={"ID":"11e49ec1-5731-46a7-b9fc-96d552122a36","Type":"ContainerStarted","Data":"f243dad7ca11935fb6af49dd49240538fa238ceeddc132e4702cc5c7dbe5f9d7"} Nov 25 09:06:00 crc kubenswrapper[4565]: I1125 09:06:00.452499 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4clc8" event={"ID":"11e49ec1-5731-46a7-b9fc-96d552122a36","Type":"ContainerStarted","Data":"62e89c343628dae12d8cc89f684b093e91d5702d5fdaa89fa12105ab03d74d1e"} Nov 25 09:06:01 crc kubenswrapper[4565]: I1125 09:06:01.096664 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:06:01 crc kubenswrapper[4565]: I1125 09:06:01.096720 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:06:01 crc kubenswrapper[4565]: E1125 09:06:01.096769 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:06:01 crc kubenswrapper[4565]: I1125 09:06:01.096736 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:01 crc kubenswrapper[4565]: E1125 09:06:01.096821 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:06:01 crc kubenswrapper[4565]: I1125 09:06:01.096723 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:06:01 crc kubenswrapper[4565]: E1125 09:06:01.096893 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:06:01 crc kubenswrapper[4565]: E1125 09:06:01.096973 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:06:02 crc kubenswrapper[4565]: I1125 09:06:02.868849 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs\") pod \"network-metrics-daemon-fzpzk\" (UID: \"b5b047b2-31c7-45e7-a944-8d9c6de61061\") " pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:06:02 crc kubenswrapper[4565]: E1125 09:06:02.869000 4565 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 09:06:02 crc kubenswrapper[4565]: E1125 09:06:02.869045 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs podName:b5b047b2-31c7-45e7-a944-8d9c6de61061 nodeName:}" failed. No retries permitted until 2025-11-25 09:07:06.869032306 +0000 UTC m=+160.071527444 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs") pod "network-metrics-daemon-fzpzk" (UID: "b5b047b2-31c7-45e7-a944-8d9c6de61061") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 09:06:03 crc kubenswrapper[4565]: I1125 09:06:03.096337 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:06:03 crc kubenswrapper[4565]: I1125 09:06:03.096377 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:06:03 crc kubenswrapper[4565]: I1125 09:06:03.096398 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:06:03 crc kubenswrapper[4565]: E1125 09:06:03.096429 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:06:03 crc kubenswrapper[4565]: I1125 09:06:03.096469 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:03 crc kubenswrapper[4565]: E1125 09:06:03.096498 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:06:03 crc kubenswrapper[4565]: E1125 09:06:03.096578 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:06:03 crc kubenswrapper[4565]: E1125 09:06:03.096653 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:06:05 crc kubenswrapper[4565]: I1125 09:06:05.096462 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:06:05 crc kubenswrapper[4565]: I1125 09:06:05.096476 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:05 crc kubenswrapper[4565]: I1125 09:06:05.096549 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:06:05 crc kubenswrapper[4565]: E1125 09:06:05.096630 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:06:05 crc kubenswrapper[4565]: I1125 09:06:05.096643 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:06:05 crc kubenswrapper[4565]: E1125 09:06:05.096734 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:06:05 crc kubenswrapper[4565]: E1125 09:06:05.096756 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:06:05 crc kubenswrapper[4565]: E1125 09:06:05.096801 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:06:07 crc kubenswrapper[4565]: I1125 09:06:07.096702 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:06:07 crc kubenswrapper[4565]: I1125 09:06:07.096731 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:06:07 crc kubenswrapper[4565]: E1125 09:06:07.096786 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:06:07 crc kubenswrapper[4565]: I1125 09:06:07.096893 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:07 crc kubenswrapper[4565]: E1125 09:06:07.098669 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:06:07 crc kubenswrapper[4565]: E1125 09:06:07.099169 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:06:07 crc kubenswrapper[4565]: I1125 09:06:07.099554 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:06:07 crc kubenswrapper[4565]: E1125 09:06:07.099995 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:06:09 crc kubenswrapper[4565]: I1125 09:06:09.096163 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:06:09 crc kubenswrapper[4565]: I1125 09:06:09.096190 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:09 crc kubenswrapper[4565]: E1125 09:06:09.096254 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:06:09 crc kubenswrapper[4565]: I1125 09:06:09.096168 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:06:09 crc kubenswrapper[4565]: I1125 09:06:09.096634 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:06:09 crc kubenswrapper[4565]: E1125 09:06:09.096747 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:06:09 crc kubenswrapper[4565]: E1125 09:06:09.098091 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:06:09 crc kubenswrapper[4565]: I1125 09:06:09.098731 4565 scope.go:117] "RemoveContainer" containerID="7b95d45578a8935128ba3e8834bef55a7ffce6465b6e61628550da58ad22e576" Nov 25 09:06:09 crc kubenswrapper[4565]: E1125 09:06:09.098870 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vk74d_openshift-ovn-kubernetes(23e95c48-8d61-4222-a968-b86203ef8aab)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" Nov 25 09:06:09 crc kubenswrapper[4565]: E1125 09:06:09.099045 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:06:11 crc kubenswrapper[4565]: I1125 09:06:11.096565 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:06:11 crc kubenswrapper[4565]: I1125 09:06:11.096950 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:06:11 crc kubenswrapper[4565]: I1125 09:06:11.096990 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:11 crc kubenswrapper[4565]: E1125 09:06:11.097053 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:06:11 crc kubenswrapper[4565]: E1125 09:06:11.097129 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:06:11 crc kubenswrapper[4565]: I1125 09:06:11.097165 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:06:11 crc kubenswrapper[4565]: E1125 09:06:11.097227 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:06:11 crc kubenswrapper[4565]: E1125 09:06:11.097264 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:06:13 crc kubenswrapper[4565]: I1125 09:06:13.096391 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:06:13 crc kubenswrapper[4565]: E1125 09:06:13.096877 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:06:13 crc kubenswrapper[4565]: I1125 09:06:13.096473 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:06:13 crc kubenswrapper[4565]: E1125 09:06:13.097157 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:06:13 crc kubenswrapper[4565]: I1125 09:06:13.096446 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:06:13 crc kubenswrapper[4565]: E1125 09:06:13.097328 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:06:13 crc kubenswrapper[4565]: I1125 09:06:13.096505 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:13 crc kubenswrapper[4565]: E1125 09:06:13.097490 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:06:15 crc kubenswrapper[4565]: I1125 09:06:15.096422 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:06:15 crc kubenswrapper[4565]: I1125 09:06:15.096484 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:15 crc kubenswrapper[4565]: I1125 09:06:15.096530 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:06:15 crc kubenswrapper[4565]: E1125 09:06:15.096605 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:06:15 crc kubenswrapper[4565]: E1125 09:06:15.096694 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:06:15 crc kubenswrapper[4565]: E1125 09:06:15.096764 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:06:15 crc kubenswrapper[4565]: I1125 09:06:15.097020 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:06:15 crc kubenswrapper[4565]: E1125 09:06:15.097092 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:06:17 crc kubenswrapper[4565]: I1125 09:06:17.097141 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:06:17 crc kubenswrapper[4565]: I1125 09:06:17.097221 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:17 crc kubenswrapper[4565]: E1125 09:06:17.098403 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:06:17 crc kubenswrapper[4565]: I1125 09:06:17.098417 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:06:17 crc kubenswrapper[4565]: I1125 09:06:17.098426 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:06:17 crc kubenswrapper[4565]: E1125 09:06:17.098467 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:06:17 crc kubenswrapper[4565]: E1125 09:06:17.098610 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:06:17 crc kubenswrapper[4565]: E1125 09:06:17.098829 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:06:19 crc kubenswrapper[4565]: I1125 09:06:19.096614 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:06:19 crc kubenswrapper[4565]: I1125 09:06:19.096638 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:06:19 crc kubenswrapper[4565]: I1125 09:06:19.096716 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:06:19 crc kubenswrapper[4565]: E1125 09:06:19.096709 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:06:19 crc kubenswrapper[4565]: I1125 09:06:19.096751 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:19 crc kubenswrapper[4565]: E1125 09:06:19.096817 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:06:19 crc kubenswrapper[4565]: E1125 09:06:19.096868 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:06:19 crc kubenswrapper[4565]: E1125 09:06:19.096909 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:06:19 crc kubenswrapper[4565]: I1125 09:06:19.495757 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jpfp5_6d96c20a-2514-47cf-99ec-a314bacac513/kube-multus/1.log" Nov 25 09:06:19 crc kubenswrapper[4565]: I1125 09:06:19.496128 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jpfp5_6d96c20a-2514-47cf-99ec-a314bacac513/kube-multus/0.log" Nov 25 09:06:19 crc kubenswrapper[4565]: I1125 09:06:19.496162 4565 generic.go:334] "Generic (PLEG): container finished" podID="6d96c20a-2514-47cf-99ec-a314bacac513" containerID="a957dd6a78e51bbac2e5e91939083721f8d1b1efcb75f447880a584b6af8c59e" exitCode=1 Nov 25 09:06:19 crc kubenswrapper[4565]: I1125 09:06:19.496183 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jpfp5" 
event={"ID":"6d96c20a-2514-47cf-99ec-a314bacac513","Type":"ContainerDied","Data":"a957dd6a78e51bbac2e5e91939083721f8d1b1efcb75f447880a584b6af8c59e"} Nov 25 09:06:19 crc kubenswrapper[4565]: I1125 09:06:19.496206 4565 scope.go:117] "RemoveContainer" containerID="a3713c6aa09bd9f93d8584cd6f30944a42328b702cd1fc25409d41b92e8100e9" Nov 25 09:06:19 crc kubenswrapper[4565]: I1125 09:06:19.496430 4565 scope.go:117] "RemoveContainer" containerID="a957dd6a78e51bbac2e5e91939083721f8d1b1efcb75f447880a584b6af8c59e" Nov 25 09:06:19 crc kubenswrapper[4565]: E1125 09:06:19.496595 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-jpfp5_openshift-multus(6d96c20a-2514-47cf-99ec-a314bacac513)\"" pod="openshift-multus/multus-jpfp5" podUID="6d96c20a-2514-47cf-99ec-a314bacac513" Nov 25 09:06:19 crc kubenswrapper[4565]: I1125 09:06:19.508464 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4clc8" podStartSLOduration=95.508453227 podStartE2EDuration="1m35.508453227s" podCreationTimestamp="2025-11-25 09:04:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:00.462022421 +0000 UTC m=+93.664517559" watchObservedRunningTime="2025-11-25 09:06:19.508453227 +0000 UTC m=+112.710948365" Nov 25 09:06:20 crc kubenswrapper[4565]: I1125 09:06:20.097415 4565 scope.go:117] "RemoveContainer" containerID="7b95d45578a8935128ba3e8834bef55a7ffce6465b6e61628550da58ad22e576" Nov 25 09:06:20 crc kubenswrapper[4565]: I1125 09:06:20.499723 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jpfp5_6d96c20a-2514-47cf-99ec-a314bacac513/kube-multus/1.log" Nov 25 09:06:20 crc kubenswrapper[4565]: I1125 09:06:20.501519 4565 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vk74d_23e95c48-8d61-4222-a968-b86203ef8aab/ovnkube-controller/3.log" Nov 25 09:06:20 crc kubenswrapper[4565]: I1125 09:06:20.503434 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerStarted","Data":"446386ecc8985c225115f5f5270aa4e5dc6f01c72c0e5763b832e06920890368"} Nov 25 09:06:20 crc kubenswrapper[4565]: I1125 09:06:20.503761 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:06:20 crc kubenswrapper[4565]: I1125 09:06:20.521854 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" podStartSLOduration=95.521844208 podStartE2EDuration="1m35.521844208s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:20.521285899 +0000 UTC m=+113.723781036" watchObservedRunningTime="2025-11-25 09:06:20.521844208 +0000 UTC m=+113.724339346" Nov 25 09:06:20 crc kubenswrapper[4565]: I1125 09:06:20.674068 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fzpzk"] Nov 25 09:06:20 crc kubenswrapper[4565]: I1125 09:06:20.674154 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:06:20 crc kubenswrapper[4565]: E1125 09:06:20.674223 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:06:21 crc kubenswrapper[4565]: I1125 09:06:21.097171 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:06:21 crc kubenswrapper[4565]: I1125 09:06:21.097172 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:21 crc kubenswrapper[4565]: I1125 09:06:21.097258 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:06:21 crc kubenswrapper[4565]: E1125 09:06:21.097350 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:06:21 crc kubenswrapper[4565]: E1125 09:06:21.097516 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:06:21 crc kubenswrapper[4565]: E1125 09:06:21.097606 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:06:23 crc kubenswrapper[4565]: I1125 09:06:23.096181 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:23 crc kubenswrapper[4565]: I1125 09:06:23.096218 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:06:23 crc kubenswrapper[4565]: I1125 09:06:23.096238 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:06:23 crc kubenswrapper[4565]: E1125 09:06:23.096270 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:06:23 crc kubenswrapper[4565]: I1125 09:06:23.096319 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:06:23 crc kubenswrapper[4565]: E1125 09:06:23.096436 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:06:23 crc kubenswrapper[4565]: E1125 09:06:23.096499 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:06:23 crc kubenswrapper[4565]: E1125 09:06:23.096552 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:06:25 crc kubenswrapper[4565]: I1125 09:06:25.096439 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:06:25 crc kubenswrapper[4565]: I1125 09:06:25.096510 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:25 crc kubenswrapper[4565]: I1125 09:06:25.096455 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:06:25 crc kubenswrapper[4565]: E1125 09:06:25.096536 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:06:25 crc kubenswrapper[4565]: I1125 09:06:25.096544 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:06:25 crc kubenswrapper[4565]: E1125 09:06:25.096605 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:06:25 crc kubenswrapper[4565]: E1125 09:06:25.096678 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:06:25 crc kubenswrapper[4565]: E1125 09:06:25.096730 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:06:27 crc kubenswrapper[4565]: E1125 09:06:27.089972 4565 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Nov 25 09:06:27 crc kubenswrapper[4565]: I1125 09:06:27.096708 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:06:27 crc kubenswrapper[4565]: I1125 09:06:27.096777 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:06:27 crc kubenswrapper[4565]: I1125 09:06:27.096830 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:06:27 crc kubenswrapper[4565]: I1125 09:06:27.096910 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:27 crc kubenswrapper[4565]: E1125 09:06:27.097547 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:06:27 crc kubenswrapper[4565]: E1125 09:06:27.097656 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:06:27 crc kubenswrapper[4565]: E1125 09:06:27.097733 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:06:27 crc kubenswrapper[4565]: E1125 09:06:27.097780 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:06:27 crc kubenswrapper[4565]: E1125 09:06:27.173699 4565 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Nov 25 09:06:28 crc kubenswrapper[4565]: I1125 09:06:28.503971 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:06:29 crc kubenswrapper[4565]: I1125 09:06:29.096218 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:06:29 crc kubenswrapper[4565]: I1125 09:06:29.096291 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:06:29 crc kubenswrapper[4565]: I1125 09:06:29.096310 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:29 crc kubenswrapper[4565]: E1125 09:06:29.096325 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:06:29 crc kubenswrapper[4565]: E1125 09:06:29.096431 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:06:29 crc kubenswrapper[4565]: I1125 09:06:29.096457 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:06:29 crc kubenswrapper[4565]: E1125 09:06:29.096542 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:06:29 crc kubenswrapper[4565]: E1125 09:06:29.096602 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:06:31 crc kubenswrapper[4565]: I1125 09:06:31.096753 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:31 crc kubenswrapper[4565]: E1125 09:06:31.096852 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:06:31 crc kubenswrapper[4565]: I1125 09:06:31.096757 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:06:31 crc kubenswrapper[4565]: I1125 09:06:31.096957 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:06:31 crc kubenswrapper[4565]: E1125 09:06:31.096978 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:06:31 crc kubenswrapper[4565]: E1125 09:06:31.097068 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:06:31 crc kubenswrapper[4565]: I1125 09:06:31.097085 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:06:31 crc kubenswrapper[4565]: E1125 09:06:31.097134 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:06:32 crc kubenswrapper[4565]: I1125 09:06:32.097148 4565 scope.go:117] "RemoveContainer" containerID="a957dd6a78e51bbac2e5e91939083721f8d1b1efcb75f447880a584b6af8c59e" Nov 25 09:06:32 crc kubenswrapper[4565]: E1125 09:06:32.174592 4565 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 25 09:06:32 crc kubenswrapper[4565]: I1125 09:06:32.531553 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jpfp5_6d96c20a-2514-47cf-99ec-a314bacac513/kube-multus/1.log" Nov 25 09:06:32 crc kubenswrapper[4565]: I1125 09:06:32.531592 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jpfp5" event={"ID":"6d96c20a-2514-47cf-99ec-a314bacac513","Type":"ContainerStarted","Data":"1d74c60a772dcdfd7245f2525a3085bb205b1fe6acde268e9f4df5e531c33ae1"} Nov 25 09:06:33 crc kubenswrapper[4565]: I1125 09:06:33.096654 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:06:33 crc kubenswrapper[4565]: I1125 09:06:33.096695 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:33 crc kubenswrapper[4565]: I1125 09:06:33.096808 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:06:33 crc kubenswrapper[4565]: E1125 09:06:33.096784 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:06:33 crc kubenswrapper[4565]: E1125 09:06:33.096864 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:06:33 crc kubenswrapper[4565]: E1125 09:06:33.096972 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:06:33 crc kubenswrapper[4565]: I1125 09:06:33.097145 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:06:33 crc kubenswrapper[4565]: E1125 09:06:33.097323 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:06:35 crc kubenswrapper[4565]: I1125 09:06:35.097044 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:35 crc kubenswrapper[4565]: I1125 09:06:35.097068 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:06:35 crc kubenswrapper[4565]: I1125 09:06:35.097111 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:06:35 crc kubenswrapper[4565]: I1125 09:06:35.097151 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:06:35 crc kubenswrapper[4565]: E1125 09:06:35.097161 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:06:35 crc kubenswrapper[4565]: E1125 09:06:35.097270 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:06:35 crc kubenswrapper[4565]: E1125 09:06:35.097310 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:06:35 crc kubenswrapper[4565]: E1125 09:06:35.097359 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:06:37 crc kubenswrapper[4565]: I1125 09:06:37.096645 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:06:37 crc kubenswrapper[4565]: I1125 09:06:37.096662 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:06:37 crc kubenswrapper[4565]: I1125 09:06:37.096676 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:37 crc kubenswrapper[4565]: I1125 09:06:37.096675 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:06:37 crc kubenswrapper[4565]: E1125 09:06:37.101502 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fzpzk" podUID="b5b047b2-31c7-45e7-a944-8d9c6de61061" Nov 25 09:06:37 crc kubenswrapper[4565]: E1125 09:06:37.101580 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 09:06:37 crc kubenswrapper[4565]: E1125 09:06:37.101667 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 09:06:37 crc kubenswrapper[4565]: E1125 09:06:37.101732 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 09:06:39 crc kubenswrapper[4565]: I1125 09:06:39.096565 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:06:39 crc kubenswrapper[4565]: I1125 09:06:39.096603 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:06:39 crc kubenswrapper[4565]: I1125 09:06:39.096626 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:39 crc kubenswrapper[4565]: I1125 09:06:39.096634 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:06:39 crc kubenswrapper[4565]: I1125 09:06:39.099588 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 25 09:06:39 crc kubenswrapper[4565]: I1125 09:06:39.099605 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 25 09:06:39 crc kubenswrapper[4565]: I1125 09:06:39.099616 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 25 09:06:39 crc kubenswrapper[4565]: I1125 09:06:39.099623 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 25 09:06:39 crc kubenswrapper[4565]: I1125 09:06:39.099702 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 25 09:06:39 crc kubenswrapper[4565]: I1125 09:06:39.099738 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.529186 4565 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.554964 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pgjmj"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.557767 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-72h5b"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.557900 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-pgjmj" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.558149 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-wbp5x"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.558457 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-wbp5x" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.558816 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-72h5b" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.559155 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-926vb"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.560101 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-926vb" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.560325 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fljns"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.564995 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.565553 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-s7fbs"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.565877 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s7fbs" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.566189 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-kjbbz"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.566409 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-kjbbz" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.567095 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n4r7s"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.567309 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n4r7s" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.567335 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fr2kb"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.567551 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fr2kb" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.568027 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vvqqw"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.568392 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.571328 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.576664 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.576730 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.576737 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.576773 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.576818 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.576838 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.576866 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.576910 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.576946 4565 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"kube-root-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.576965 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.576865 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.577015 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.577046 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.577120 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.577585 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.577835 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6mnps"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.577632 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.577688 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.577955 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.578277 4565 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6mnps" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.578475 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7lhlj"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.578778 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj" Nov 25 09:06:40 crc kubenswrapper[4565]: W1125 09:06:40.579101 4565 reflector.go:561] object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert": failed to list *v1.Secret: secrets "openshift-apiserver-operator-serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver-operator": no relationship found between node 'crc' and this object Nov 25 09:06:40 crc kubenswrapper[4565]: E1125 09:06:40.579131 4565 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-apiserver-operator-serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.579746 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.581028 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.581170 4565 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-operator-config" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.581287 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.581411 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.581776 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.581892 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.582014 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.584713 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.584852 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.585062 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.585289 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.585399 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.585514 4565 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.585635 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.585755 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.585870 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.586036 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.587012 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.587321 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.587566 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.588472 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.588586 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.588691 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.588794 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.588911 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.593173 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.593372 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.593394 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.593466 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 25 09:06:40 
crc kubenswrapper[4565]: I1125 09:06:40.593516 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.593519 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.591962 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.591995 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.592036 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.592297 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.592395 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.593031 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.593063 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.591663 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.594097 4565 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.596054 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.596118 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.596062 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.596240 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.596268 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.598827 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.599168 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.599575 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.599664 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.599749 4565 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.599664 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.600632 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ls4wr"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.600971 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nzk24"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.601215 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.601470 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ls4wr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.601528 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-pzm4n"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.603302 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.605008 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.605141 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pzm4n" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.608133 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.608755 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.609749 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8xkx9"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.610713 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.621867 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.622136 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.623769 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.624058 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.624122 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.624225 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 25 09:06:40 crc kubenswrapper[4565]: 
I1125 09:06:40.624319 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.624066 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.624771 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.624909 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.625169 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.625356 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.625581 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.626188 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.626672 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-4bqts"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.627040 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r5s6w"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.627314 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r5s6w" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.627555 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-8xkx9" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.628073 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-4bqts" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.628345 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-wvc8w"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.628597 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.628704 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.634006 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.634154 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.634362 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.634548 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.634718 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 
09:06:40.634558 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.635007 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.636036 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-h5ktx"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.636366 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.636381 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-h5ktx" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.636394 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.636812 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.637677 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9xwj"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.638028 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9xwj" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.638043 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v76jt"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.638324 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v76jt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.638361 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.639350 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.639457 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-57ghm"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.639876 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-57ghm" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.640055 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gm2dl"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.641882 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.645023 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.645540 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2fq46"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.646359 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gm2dl" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.646761 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w788b"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.647768 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-cgzl7"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.649193 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w788b" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.649218 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cgzl7" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.649458 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2fq46" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.658499 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5c4hq"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.663458 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-smktm"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.663826 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-tj8jh"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.669588 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f8pjc"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.670128 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-q5drb"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.670617 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f8pjc" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.670840 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tj8jh" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.671235 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-q5drb" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.671598 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5c4hq" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.671769 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-smktm" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.672420 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp5cw\" (UniqueName: \"kubernetes.io/projected/9ba89570-d3db-4eb1-a768-a07858224030-kube-api-access-fp5cw\") pod \"console-operator-58897d9998-6mnps\" (UID: \"9ba89570-d3db-4eb1-a768-a07858224030\") " pod="openshift-console-operator/console-operator-58897d9998-6mnps" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.672449 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7v8n\" (UniqueName: \"kubernetes.io/projected/d9d37581-10f7-4e98-81e1-6a17ef4a527a-kube-api-access-w7v8n\") pod \"route-controller-manager-6576b87f9c-2v7lr\" (UID: \"d9d37581-10f7-4e98-81e1-6a17ef4a527a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.672472 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4792f05e-1c4a-4c87-991d-e26851501f52-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ls4wr\" (UID: \"4792f05e-1c4a-4c87-991d-e26851501f52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ls4wr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.672493 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ba89570-d3db-4eb1-a768-a07858224030-trusted-ca\") pod 
\"console-operator-58897d9998-6mnps\" (UID: \"9ba89570-d3db-4eb1-a768-a07858224030\") " pod="openshift-console-operator/console-operator-58897d9998-6mnps" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.672520 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ba89570-d3db-4eb1-a768-a07858224030-serving-cert\") pod \"console-operator-58897d9998-6mnps\" (UID: \"9ba89570-d3db-4eb1-a768-a07858224030\") " pod="openshift-console-operator/console-operator-58897d9998-6mnps" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.672547 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9d37581-10f7-4e98-81e1-6a17ef4a527a-client-ca\") pod \"route-controller-manager-6576b87f9c-2v7lr\" (UID: \"d9d37581-10f7-4e98-81e1-6a17ef4a527a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.672564 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9d37581-10f7-4e98-81e1-6a17ef4a527a-serving-cert\") pod \"route-controller-manager-6576b87f9c-2v7lr\" (UID: \"d9d37581-10f7-4e98-81e1-6a17ef4a527a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.672581 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9d37581-10f7-4e98-81e1-6a17ef4a527a-config\") pod \"route-controller-manager-6576b87f9c-2v7lr\" (UID: \"d9d37581-10f7-4e98-81e1-6a17ef4a527a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.672599 4565 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ba89570-d3db-4eb1-a768-a07858224030-config\") pod \"console-operator-58897d9998-6mnps\" (UID: \"9ba89570-d3db-4eb1-a768-a07858224030\") " pod="openshift-console-operator/console-operator-58897d9998-6mnps" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.672614 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sngkb\" (UniqueName: \"kubernetes.io/projected/4792f05e-1c4a-4c87-991d-e26851501f52-kube-api-access-sngkb\") pod \"cluster-image-registry-operator-dc59b4c8b-ls4wr\" (UID: \"4792f05e-1c4a-4c87-991d-e26851501f52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ls4wr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.672652 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/418a0125-b167-49b8-b6bd-0c97a587107c-service-ca\") pod \"console-f9d7485db-wvc8w\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.672668 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzts8\" (UniqueName: \"kubernetes.io/projected/418a0125-b167-49b8-b6bd-0c97a587107c-kube-api-access-fzts8\") pod \"console-f9d7485db-wvc8w\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.672683 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/418a0125-b167-49b8-b6bd-0c97a587107c-trusted-ca-bundle\") pod \"console-f9d7485db-wvc8w\" (UID: 
\"418a0125-b167-49b8-b6bd-0c97a587107c\") " pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.672699 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4792f05e-1c4a-4c87-991d-e26851501f52-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ls4wr\" (UID: \"4792f05e-1c4a-4c87-991d-e26851501f52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ls4wr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.672731 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/418a0125-b167-49b8-b6bd-0c97a587107c-console-oauth-config\") pod \"console-f9d7485db-wvc8w\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.672758 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/418a0125-b167-49b8-b6bd-0c97a587107c-console-config\") pod \"console-f9d7485db-wvc8w\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.672776 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/418a0125-b167-49b8-b6bd-0c97a587107c-console-serving-cert\") pod \"console-f9d7485db-wvc8w\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.672790 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/418a0125-b167-49b8-b6bd-0c97a587107c-oauth-serving-cert\") pod \"console-f9d7485db-wvc8w\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.672813 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4792f05e-1c4a-4c87-991d-e26851501f52-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ls4wr\" (UID: \"4792f05e-1c4a-4c87-991d-e26851501f52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ls4wr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.674740 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8x25t"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.676197 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8x25t" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.676998 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-c567c"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.678572 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.690083 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-f7mzc"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.690236 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c567c" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.691300 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401020-4vwj8"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.691586 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wmz6h"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.692266 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-72h5b"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.692287 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fr2kb"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.692346 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wmz6h" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.692505 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-f7mzc" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.692674 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401020-4vwj8" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.694881 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-kjbbz"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.694903 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pgjmj"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.694913 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n4r7s"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.696130 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.696727 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.699181 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8xkx9"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.699510 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.700053 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vvqqw"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.702345 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gm2dl"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.704050 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-fljns"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.705766 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-pzm4n"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.707817 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-wbp5x"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.709155 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-cgzl7"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.710210 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.711578 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7lhlj"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.714973 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ls4wr"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.716085 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2fq46"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.717135 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v76jt"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.721114 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6mnps"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.721138 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nzk24"] Nov 25 09:06:40 crc 
kubenswrapper[4565]: I1125 09:06:40.721148 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r5s6w"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.722936 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.723038 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-wvc8w"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.723284 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9xwj"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.724727 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-smktm"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.725650 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5c4hq"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.726106 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-s7fbs"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.727682 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-4bqts"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.727953 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-57ghm"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.728696 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w788b"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.729616 4565 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-c567c"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.730782 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f8pjc"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.731369 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8x25t"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.732161 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-crrxv"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.733370 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-bdn5h"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.733449 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-crrxv" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.733945 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-f7mzc"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.733987 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-bdn5h" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.736050 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wmz6h"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.736074 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-tj8jh"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.736084 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401020-4vwj8"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.737956 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-q5drb"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.737983 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bdn5h"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.738607 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-crrxv"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.740409 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.759619 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.773288 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/418a0125-b167-49b8-b6bd-0c97a587107c-console-serving-cert\") pod \"console-f9d7485db-wvc8w\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.773313 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/418a0125-b167-49b8-b6bd-0c97a587107c-oauth-serving-cert\") pod \"console-f9d7485db-wvc8w\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.773332 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4792f05e-1c4a-4c87-991d-e26851501f52-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ls4wr\" (UID: \"4792f05e-1c4a-4c87-991d-e26851501f52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ls4wr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.773353 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp5cw\" (UniqueName: \"kubernetes.io/projected/9ba89570-d3db-4eb1-a768-a07858224030-kube-api-access-fp5cw\") pod \"console-operator-58897d9998-6mnps\" (UID: \"9ba89570-d3db-4eb1-a768-a07858224030\") " pod="openshift-console-operator/console-operator-58897d9998-6mnps" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.773368 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7v8n\" (UniqueName: \"kubernetes.io/projected/d9d37581-10f7-4e98-81e1-6a17ef4a527a-kube-api-access-w7v8n\") pod \"route-controller-manager-6576b87f9c-2v7lr\" (UID: \"d9d37581-10f7-4e98-81e1-6a17ef4a527a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.773384 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4792f05e-1c4a-4c87-991d-e26851501f52-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ls4wr\" (UID: 
\"4792f05e-1c4a-4c87-991d-e26851501f52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ls4wr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.773399 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ba89570-d3db-4eb1-a768-a07858224030-trusted-ca\") pod \"console-operator-58897d9998-6mnps\" (UID: \"9ba89570-d3db-4eb1-a768-a07858224030\") " pod="openshift-console-operator/console-operator-58897d9998-6mnps" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.773864 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ba89570-d3db-4eb1-a768-a07858224030-serving-cert\") pod \"console-operator-58897d9998-6mnps\" (UID: \"9ba89570-d3db-4eb1-a768-a07858224030\") " pod="openshift-console-operator/console-operator-58897d9998-6mnps" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.775746 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9d37581-10f7-4e98-81e1-6a17ef4a527a-client-ca\") pod \"route-controller-manager-6576b87f9c-2v7lr\" (UID: \"d9d37581-10f7-4e98-81e1-6a17ef4a527a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.775785 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9d37581-10f7-4e98-81e1-6a17ef4a527a-serving-cert\") pod \"route-controller-manager-6576b87f9c-2v7lr\" (UID: \"d9d37581-10f7-4e98-81e1-6a17ef4a527a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.775830 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d9d37581-10f7-4e98-81e1-6a17ef4a527a-config\") pod \"route-controller-manager-6576b87f9c-2v7lr\" (UID: \"d9d37581-10f7-4e98-81e1-6a17ef4a527a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.775867 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ba89570-d3db-4eb1-a768-a07858224030-config\") pod \"console-operator-58897d9998-6mnps\" (UID: \"9ba89570-d3db-4eb1-a768-a07858224030\") " pod="openshift-console-operator/console-operator-58897d9998-6mnps" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.775912 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sngkb\" (UniqueName: \"kubernetes.io/projected/4792f05e-1c4a-4c87-991d-e26851501f52-kube-api-access-sngkb\") pod \"cluster-image-registry-operator-dc59b4c8b-ls4wr\" (UID: \"4792f05e-1c4a-4c87-991d-e26851501f52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ls4wr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.775991 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/418a0125-b167-49b8-b6bd-0c97a587107c-service-ca\") pod \"console-f9d7485db-wvc8w\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.776286 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzts8\" (UniqueName: \"kubernetes.io/projected/418a0125-b167-49b8-b6bd-0c97a587107c-kube-api-access-fzts8\") pod \"console-f9d7485db-wvc8w\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.776347 4565 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/418a0125-b167-49b8-b6bd-0c97a587107c-trusted-ca-bundle\") pod \"console-f9d7485db-wvc8w\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.776369 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4792f05e-1c4a-4c87-991d-e26851501f52-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ls4wr\" (UID: \"4792f05e-1c4a-4c87-991d-e26851501f52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ls4wr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.776419 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/418a0125-b167-49b8-b6bd-0c97a587107c-console-oauth-config\") pod \"console-f9d7485db-wvc8w\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.776457 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/418a0125-b167-49b8-b6bd-0c97a587107c-console-config\") pod \"console-f9d7485db-wvc8w\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.778362 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9d37581-10f7-4e98-81e1-6a17ef4a527a-client-ca\") pod \"route-controller-manager-6576b87f9c-2v7lr\" (UID: \"d9d37581-10f7-4e98-81e1-6a17ef4a527a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 
09:06:40.778924 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9d37581-10f7-4e98-81e1-6a17ef4a527a-config\") pod \"route-controller-manager-6576b87f9c-2v7lr\" (UID: \"d9d37581-10f7-4e98-81e1-6a17ef4a527a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.779255 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ba89570-d3db-4eb1-a768-a07858224030-config\") pod \"console-operator-58897d9998-6mnps\" (UID: \"9ba89570-d3db-4eb1-a768-a07858224030\") " pod="openshift-console-operator/console-operator-58897d9998-6mnps" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.780076 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4792f05e-1c4a-4c87-991d-e26851501f52-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ls4wr\" (UID: \"4792f05e-1c4a-4c87-991d-e26851501f52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ls4wr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.780698 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ba89570-d3db-4eb1-a768-a07858224030-trusted-ca\") pod \"console-operator-58897d9998-6mnps\" (UID: \"9ba89570-d3db-4eb1-a768-a07858224030\") " pod="openshift-console-operator/console-operator-58897d9998-6mnps" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.780941 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.782992 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/4792f05e-1c4a-4c87-991d-e26851501f52-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ls4wr\" (UID: \"4792f05e-1c4a-4c87-991d-e26851501f52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ls4wr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.783566 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9d37581-10f7-4e98-81e1-6a17ef4a527a-serving-cert\") pod \"route-controller-manager-6576b87f9c-2v7lr\" (UID: \"d9d37581-10f7-4e98-81e1-6a17ef4a527a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.784438 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-d6pbv"] Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.786015 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-d6pbv" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.788027 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ba89570-d3db-4eb1-a768-a07858224030-serving-cert\") pod \"console-operator-58897d9998-6mnps\" (UID: \"9ba89570-d3db-4eb1-a768-a07858224030\") " pod="openshift-console-operator/console-operator-58897d9998-6mnps" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.799392 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.819509 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.840320 4565 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.864287 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.869979 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/418a0125-b167-49b8-b6bd-0c97a587107c-trusted-ca-bundle\") pod \"console-f9d7485db-wvc8w\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.879339 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.881501 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/418a0125-b167-49b8-b6bd-0c97a587107c-console-oauth-config\") pod \"console-f9d7485db-wvc8w\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.899702 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.919514 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.921779 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/418a0125-b167-49b8-b6bd-0c97a587107c-console-serving-cert\") pod \"console-f9d7485db-wvc8w\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.940166 4565 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.950882 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/418a0125-b167-49b8-b6bd-0c97a587107c-oauth-serving-cert\") pod \"console-f9d7485db-wvc8w\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.959188 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.967527 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/418a0125-b167-49b8-b6bd-0c97a587107c-console-config\") pod \"console-f9d7485db-wvc8w\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.979594 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 25 09:06:40 crc kubenswrapper[4565]: I1125 09:06:40.989002 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/418a0125-b167-49b8-b6bd-0c97a587107c-service-ca\") pod \"console-f9d7485db-wvc8w\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.019875 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.040471 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.060109 4565 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.079378 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.099323 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.119550 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.140329 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.159674 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.180258 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.199406 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.219282 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.239597 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.260173 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.279423 4565 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.299167 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.319372 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.339820 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.359969 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.380031 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.399857 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.420251 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.439593 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.460025 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.479086 4565 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.499413 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.520056 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.539224 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.559989 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.579815 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.599819 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.619259 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.639237 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.658456 4565 request.go:700] Waited for 1.011535543s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/secrets?fieldSelector=metadata.name%3Dserving-cert&limit=500&resourceVersion=0
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.659202 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.679519 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.719721 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.740010 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.760302 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.779459 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.800024 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.819587 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.839729 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.859648 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.879351 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.899274 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.919223 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.939303 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.959441 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Nov 25 09:06:41 crc kubenswrapper[4565]: I1125 09:06:41.980114 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.000105 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.019487 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.039860 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.060048 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.079087 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.099401 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.119121 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.139296 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.159316 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.179797 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.200153 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.219118 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.239453 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.259671 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.280096 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.300312 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.323378 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.339274 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.359632 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.379710 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.399983 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.419578 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.439602 4565 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.459962 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.480143 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.500395 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.519453 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.539472 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.571672 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzts8\" (UniqueName: \"kubernetes.io/projected/418a0125-b167-49b8-b6bd-0c97a587107c-kube-api-access-fzts8\") pod \"console-f9d7485db-wvc8w\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " pod="openshift-console/console-f9d7485db-wvc8w"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.582434 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wvc8w"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.591117 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sngkb\" (UniqueName: \"kubernetes.io/projected/4792f05e-1c4a-4c87-991d-e26851501f52-kube-api-access-sngkb\") pod \"cluster-image-registry-operator-dc59b4c8b-ls4wr\" (UID: \"4792f05e-1c4a-4c87-991d-e26851501f52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ls4wr"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.610706 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7v8n\" (UniqueName: \"kubernetes.io/projected/d9d37581-10f7-4e98-81e1-6a17ef4a527a-kube-api-access-w7v8n\") pod \"route-controller-manager-6576b87f9c-2v7lr\" (UID: \"d9d37581-10f7-4e98-81e1-6a17ef4a527a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.634361 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4792f05e-1c4a-4c87-991d-e26851501f52-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ls4wr\" (UID: \"4792f05e-1c4a-4c87-991d-e26851501f52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ls4wr"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.651694 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp5cw\" (UniqueName: \"kubernetes.io/projected/9ba89570-d3db-4eb1-a768-a07858224030-kube-api-access-fp5cw\") pod \"console-operator-58897d9998-6mnps\" (UID: \"9ba89570-d3db-4eb1-a768-a07858224030\") " pod="openshift-console-operator/console-operator-58897d9998-6mnps"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.659451 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.678487 4565 request.go:700] Waited for 1.892193681s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dnode-bootstrapper-token&limit=500&resourceVersion=0
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.679759 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.689099 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-wvc8w"]
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.699593 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.709894 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6mnps"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.760364 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791454 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/755dab00-cc07-483e-82b6-8a3e54e6dee3-installation-pull-secrets\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791487 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/755dab00-cc07-483e-82b6-8a3e54e6dee3-bound-sa-token\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791506 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/394e1c70-96f6-4655-bb1a-50850d43136e-etcd-service-ca\") pod \"etcd-operator-b45778765-kjbbz\" (UID: \"394e1c70-96f6-4655-bb1a-50850d43136e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjbbz"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791521 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gv5p\" (UniqueName: \"kubernetes.io/projected/4841ec62-bf33-407a-b240-1409e85aa22e-kube-api-access-5gv5p\") pod \"ingress-operator-5b745b69d9-pzm4n\" (UID: \"4841ec62-bf33-407a-b240-1409e85aa22e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pzm4n"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791537 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/918a1a54-550d-4e79-b0c9-360d88fd8a6e-serving-cert\") pod \"authentication-operator-69f744f599-pgjmj\" (UID: \"918a1a54-550d-4e79-b0c9-360d88fd8a6e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pgjmj"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791551 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5nfc\" (UniqueName: \"kubernetes.io/projected/d2b8d9fc-1eab-46f7-a214-59255f20c0db-kube-api-access-l5nfc\") pod \"openshift-apiserver-operator-796bbdcf4f-n4r7s\" (UID: \"d2b8d9fc-1eab-46f7-a214-59255f20c0db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n4r7s"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791566 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/755dab00-cc07-483e-82b6-8a3e54e6dee3-registry-tls\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791578 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/684d871e-b542-4963-be08-7dba0c7b6d6a-client-ca\") pod \"controller-manager-879f6c89f-7lhlj\" (UID: \"684d871e-b542-4963-be08-7dba0c7b6d6a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791593 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6becb00b-a391-46f2-b821-c626b7903924-node-pullsecrets\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791607 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791623 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/394e1c70-96f6-4655-bb1a-50850d43136e-etcd-ca\") pod \"etcd-operator-b45778765-kjbbz\" (UID: \"394e1c70-96f6-4655-bb1a-50850d43136e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjbbz"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791664 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4vsc\" (UniqueName: \"kubernetes.io/projected/394e1c70-96f6-4655-bb1a-50850d43136e-kube-api-access-b4vsc\") pod \"etcd-operator-b45778765-kjbbz\" (UID: \"394e1c70-96f6-4655-bb1a-50850d43136e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjbbz"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791694 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/262e9d06-4e95-4017-85d9-5657f520eb49-config\") pod \"machine-api-operator-5694c8668f-8xkx9\" (UID: \"262e9d06-4e95-4017-85d9-5657f520eb49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8xkx9"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791719 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebcf1227-407e-41f6-bf75-d755ecfd724f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-72h5b\" (UID: \"ebcf1227-407e-41f6-bf75-d755ecfd724f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-72h5b"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791764 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/262e9d06-4e95-4017-85d9-5657f520eb49-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8xkx9\" (UID: \"262e9d06-4e95-4017-85d9-5657f520eb49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8xkx9"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791796 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/262e9d06-4e95-4017-85d9-5657f520eb49-images\") pod \"machine-api-operator-5694c8668f-8xkx9\" (UID: \"262e9d06-4e95-4017-85d9-5657f520eb49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8xkx9"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791814 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/684d871e-b542-4963-be08-7dba0c7b6d6a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7lhlj\" (UID: \"684d871e-b542-4963-be08-7dba0c7b6d6a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791828 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/29e91702-afc2-470f-a3b9-9be851b01f9c-audit-policies\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791855 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791872 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cc1943a-ad05-4d90-89c1-2e88e30d2c56-config\") pod \"kube-apiserver-operator-766d6c64bb-r5s6w\" (UID: \"7cc1943a-ad05-4d90-89c1-2e88e30d2c56\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r5s6w"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791885 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/394e1c70-96f6-4655-bb1a-50850d43136e-serving-cert\") pod \"etcd-operator-b45778765-kjbbz\" (UID: \"394e1c70-96f6-4655-bb1a-50850d43136e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjbbz"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791897 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6becb00b-a391-46f2-b821-c626b7903924-etcd-client\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791909 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cc1943a-ad05-4d90-89c1-2e88e30d2c56-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-r5s6w\" (UID: \"7cc1943a-ad05-4d90-89c1-2e88e30d2c56\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r5s6w"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791962 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/918a1a54-550d-4e79-b0c9-360d88fd8a6e-service-ca-bundle\") pod \"authentication-operator-69f744f599-pgjmj\" (UID: \"918a1a54-550d-4e79-b0c9-360d88fd8a6e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pgjmj"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.791989 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25752784-0e60-4517-ba99-754385fa0ecb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-fr2kb\" (UID: \"25752784-0e60-4517-ba99-754385fa0ecb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fr2kb"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792006 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/755dab00-cc07-483e-82b6-8a3e54e6dee3-trusted-ca\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792022 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/684d871e-b542-4963-be08-7dba0c7b6d6a-config\") pod \"controller-manager-879f6c89f-7lhlj\" (UID: \"684d871e-b542-4963-be08-7dba0c7b6d6a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792038 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bebd549b-749f-4e30-ab21-99c32a85c0ca-available-featuregates\") pod \"openshift-config-operator-7777fb866f-s7fbs\" (UID: \"bebd549b-749f-4e30-ab21-99c32a85c0ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s7fbs"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792052 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/755dab00-cc07-483e-82b6-8a3e54e6dee3-ca-trust-extracted\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792065 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6becb00b-a391-46f2-b821-c626b7903924-audit\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792078 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792095 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792108 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792124 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792141 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/755dab00-cc07-483e-82b6-8a3e54e6dee3-registry-certificates\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792154 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6becb00b-a391-46f2-b821-c626b7903924-image-import-ca\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792169 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2b8d9fc-1eab-46f7-a214-59255f20c0db-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-n4r7s\" (UID: \"d2b8d9fc-1eab-46f7-a214-59255f20c0db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n4r7s"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792186 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebcf1227-407e-41f6-bf75-d755ecfd724f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-72h5b\" (UID: \"ebcf1227-407e-41f6-bf75-d755ecfd724f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-72h5b"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792199 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6becb00b-a391-46f2-b821-c626b7903924-serving-cert\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792214 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnp4z\" (UniqueName: \"kubernetes.io/projected/0587232c-6c1f-44d8-b7d0-be44d147bd71-kube-api-access-lnp4z\") pod \"dns-operator-744455d44c-4bqts\" (UID: \"0587232c-6c1f-44d8-b7d0-be44d147bd71\") " pod="openshift-dns-operator/dns-operator-744455d44c-4bqts"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792229 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf63e97f-f001-452c-8d17-1b8a4e40c3ae-config\") pod \"machine-approver-56656f9798-926vb\" (UID: \"cf63e97f-f001-452c-8d17-1b8a4e40c3ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-926vb"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792251 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9sjf\" (UniqueName: \"kubernetes.io/projected/918a1a54-550d-4e79-b0c9-360d88fd8a6e-kube-api-access-p9sjf\") pod \"authentication-operator-69f744f599-pgjmj\" (UID: \"918a1a54-550d-4e79-b0c9-360d88fd8a6e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pgjmj"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792264 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nsft\" (UniqueName: \"kubernetes.io/projected/6becb00b-a391-46f2-b821-c626b7903924-kube-api-access-5nsft\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792278 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792291 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792304 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cc1943a-ad05-4d90-89c1-2e88e30d2c56-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-r5s6w\" (UID: \"7cc1943a-ad05-4d90-89c1-2e88e30d2c56\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r5s6w"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792318 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cf63e97f-f001-452c-8d17-1b8a4e40c3ae-auth-proxy-config\") pod \"machine-approver-56656f9798-926vb\" (UID: \"cf63e97f-f001-452c-8d17-1b8a4e40c3ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-926vb"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792334 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/cf63e97f-f001-452c-8d17-1b8a4e40c3ae-machine-approver-tls\") pod \"machine-approver-56656f9798-926vb\" (UID: \"cf63e97f-f001-452c-8d17-1b8a4e40c3ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-926vb"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792349 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/684d871e-b542-4963-be08-7dba0c7b6d6a-serving-cert\") pod \"controller-manager-879f6c89f-7lhlj\" (UID: \"684d871e-b542-4963-be08-7dba0c7b6d6a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792365 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ebcf1227-407e-41f6-bf75-d755ecfd724f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-72h5b\" (UID: \"ebcf1227-407e-41f6-bf75-d755ecfd724f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-72h5b"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792429 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5gvw\" (UniqueName: \"kubernetes.io/projected/29e91702-afc2-470f-a3b9-9be851b01f9c-kube-api-access-w5gvw\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792445 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvtl2\" (UniqueName: \"kubernetes.io/projected/262e9d06-4e95-4017-85d9-5657f520eb49-kube-api-access-kvtl2\") pod \"machine-api-operator-5694c8668f-8xkx9\" (UID: \"262e9d06-4e95-4017-85d9-5657f520eb49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8xkx9"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792476 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6becb00b-a391-46f2-b821-c626b7903924-audit-dir\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792524 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792544 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792564 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/394e1c70-96f6-4655-bb1a-50850d43136e-etcd-client\") pod \"etcd-operator-b45778765-kjbbz\" (UID: \"394e1c70-96f6-4655-bb1a-50850d43136e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjbbz"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792601 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bebd549b-749f-4e30-ab21-99c32a85c0ca-serving-cert\") pod \"openshift-config-operator-7777fb866f-s7fbs\" (UID: \"bebd549b-749f-4e30-ab21-99c32a85c0ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s7fbs"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792617 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6becb00b-a391-46f2-b821-c626b7903924-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792631 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792660 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6becb00b-a391-46f2-b821-c626b7903924-config\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792677 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0587232c-6c1f-44d8-b7d0-be44d147bd71-metrics-tls\") pod \"dns-operator-744455d44c-4bqts\" (UID: \"0587232c-6c1f-44d8-b7d0-be44d147bd71\") " pod="openshift-dns-operator/dns-operator-744455d44c-4bqts"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792691 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4841ec62-bf33-407a-b240-1409e85aa22e-metrics-tls\") pod \"ingress-operator-5b745b69d9-pzm4n\" (UID: \"4841ec62-bf33-407a-b240-1409e85aa22e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pzm4n"
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792706 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"config\" (UniqueName: \"kubernetes.io/configmap/25752784-0e60-4517-ba99-754385fa0ecb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-fr2kb\" (UID: \"25752784-0e60-4517-ba99-754385fa0ecb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fr2kb" Nov 25 09:06:42 crc kubenswrapper[4565]: E1125 09:06:42.792723 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:43.292714054 +0000 UTC m=+136.495209192 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792743 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp8mn\" (UniqueName: \"kubernetes.io/projected/684d871e-b542-4963-be08-7dba0c7b6d6a-kube-api-access-lp8mn\") pod \"controller-manager-879f6c89f-7lhlj\" (UID: \"684d871e-b542-4963-be08-7dba0c7b6d6a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792768 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2b8d9fc-1eab-46f7-a214-59255f20c0db-config\") pod \"openshift-apiserver-operator-796bbdcf4f-n4r7s\" (UID: \"d2b8d9fc-1eab-46f7-a214-59255f20c0db\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n4r7s" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792781 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792795 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl7km\" (UniqueName: \"kubernetes.io/projected/cf63e97f-f001-452c-8d17-1b8a4e40c3ae-kube-api-access-hl7km\") pod \"machine-approver-56656f9798-926vb\" (UID: \"cf63e97f-f001-452c-8d17-1b8a4e40c3ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-926vb" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792808 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/394e1c70-96f6-4655-bb1a-50850d43136e-config\") pod \"etcd-operator-b45778765-kjbbz\" (UID: \"394e1c70-96f6-4655-bb1a-50850d43136e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjbbz" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792822 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6becb00b-a391-46f2-b821-c626b7903924-etcd-serving-ca\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792833 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/918a1a54-550d-4e79-b0c9-360d88fd8a6e-config\") pod \"authentication-operator-69f744f599-pgjmj\" (UID: \"918a1a54-550d-4e79-b0c9-360d88fd8a6e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pgjmj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792854 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfsm2\" (UniqueName: \"kubernetes.io/projected/cd73c816-12d9-42bc-bc10-2180ed13b36a-kube-api-access-lfsm2\") pod \"downloads-7954f5f757-wbp5x\" (UID: \"cd73c816-12d9-42bc-bc10-2180ed13b36a\") " pod="openshift-console/downloads-7954f5f757-wbp5x" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792867 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/29e91702-afc2-470f-a3b9-9be851b01f9c-audit-dir\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792879 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4841ec62-bf33-407a-b240-1409e85aa22e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-pzm4n\" (UID: \"4841ec62-bf33-407a-b240-1409e85aa22e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pzm4n" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792892 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nnsx\" (UniqueName: \"kubernetes.io/projected/755dab00-cc07-483e-82b6-8a3e54e6dee3-kube-api-access-8nnsx\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 
09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.792905 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4841ec62-bf33-407a-b240-1409e85aa22e-trusted-ca\") pod \"ingress-operator-5b745b69d9-pzm4n\" (UID: \"4841ec62-bf33-407a-b240-1409e85aa22e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pzm4n" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.793586 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tzsh\" (UniqueName: \"kubernetes.io/projected/25752784-0e60-4517-ba99-754385fa0ecb-kube-api-access-4tzsh\") pod \"openshift-controller-manager-operator-756b6f6bc6-fr2kb\" (UID: \"25752784-0e60-4517-ba99-754385fa0ecb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fr2kb" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.793667 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7294c\" (UniqueName: \"kubernetes.io/projected/bebd549b-749f-4e30-ab21-99c32a85c0ca-kube-api-access-7294c\") pod \"openshift-config-operator-7777fb866f-s7fbs\" (UID: \"bebd549b-749f-4e30-ab21-99c32a85c0ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s7fbs" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.793726 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6becb00b-a391-46f2-b821-c626b7903924-encryption-config\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.793746 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/918a1a54-550d-4e79-b0c9-360d88fd8a6e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pgjmj\" (UID: \"918a1a54-550d-4e79-b0c9-360d88fd8a6e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pgjmj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.810987 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6mnps"] Nov 25 09:06:42 crc kubenswrapper[4565]: W1125 09:06:42.816405 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ba89570_d3db_4eb1_a768_a07858224030.slice/crio-6be8d54f6ae82039df7a780c42d299e6bb619f6833293f03c948f98463d6ad07 WatchSource:0}: Error finding container 6be8d54f6ae82039df7a780c42d299e6bb619f6833293f03c948f98463d6ad07: Status 404 returned error can't find the container with id 6be8d54f6ae82039df7a780c42d299e6bb619f6833293f03c948f98463d6ad07 Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.834137 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.849452 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ls4wr" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.894522 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:42 crc kubenswrapper[4565]: E1125 09:06:42.894678 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:43.394657832 +0000 UTC m=+136.597152970 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.894719 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb6dm\" (UniqueName: \"kubernetes.io/projected/6dc4fc3b-ec76-4231-a2ed-d623be3502c0-kube-api-access-zb6dm\") pod \"kube-storage-version-migrator-operator-b67b599dd-gm2dl\" (UID: \"6dc4fc3b-ec76-4231-a2ed-d623be3502c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gm2dl" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.894742 4565 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66cd3426-ad3a-499d-8b09-056348e1413a-service-ca-bundle\") pod \"router-default-5444994796-h5ktx\" (UID: \"66cd3426-ad3a-499d-8b09-056348e1413a\") " pod="openshift-ingress/router-default-5444994796-h5ktx" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.894768 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6becb00b-a391-46f2-b821-c626b7903924-node-pullsecrets\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.894794 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/394e1c70-96f6-4655-bb1a-50850d43136e-etcd-ca\") pod \"etcd-operator-b45778765-kjbbz\" (UID: \"394e1c70-96f6-4655-bb1a-50850d43136e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjbbz" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.894808 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4vsc\" (UniqueName: \"kubernetes.io/projected/394e1c70-96f6-4655-bb1a-50850d43136e-kube-api-access-b4vsc\") pod \"etcd-operator-b45778765-kjbbz\" (UID: \"394e1c70-96f6-4655-bb1a-50850d43136e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjbbz" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.894822 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/262e9d06-4e95-4017-85d9-5657f520eb49-config\") pod \"machine-api-operator-5694c8668f-8xkx9\" (UID: \"262e9d06-4e95-4017-85d9-5657f520eb49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8xkx9" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 
09:06:42.894848 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebcf1227-407e-41f6-bf75-d755ecfd724f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-72h5b\" (UID: \"ebcf1227-407e-41f6-bf75-d755ecfd724f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-72h5b" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.894905 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgp2s\" (UniqueName: \"kubernetes.io/projected/f9857afe-8e2a-4ddd-8fc2-2059bff84d86-kube-api-access-mgp2s\") pod \"dns-default-bdn5h\" (UID: \"f9857afe-8e2a-4ddd-8fc2-2059bff84d86\") " pod="openshift-dns/dns-default-bdn5h" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.894950 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c78a2319-04ae-46f7-8b1c-7baec60f1dc6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-v76jt\" (UID: \"c78a2319-04ae-46f7-8b1c-7baec60f1dc6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v76jt" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.894975 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwgs7\" (UniqueName: \"kubernetes.io/projected/12c5c094-a435-4607-9e17-82b36d40c672-kube-api-access-qwgs7\") pod \"csi-hostpathplugin-crrxv\" (UID: \"12c5c094-a435-4607-9e17-82b36d40c672\") " pod="hostpath-provisioner/csi-hostpathplugin-crrxv" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.894999 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/262e9d06-4e95-4017-85d9-5657f520eb49-images\") pod \"machine-api-operator-5694c8668f-8xkx9\" (UID: 
\"262e9d06-4e95-4017-85d9-5657f520eb49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8xkx9" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.895016 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/684d871e-b542-4963-be08-7dba0c7b6d6a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7lhlj\" (UID: \"684d871e-b542-4963-be08-7dba0c7b6d6a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.895031 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/66cd3426-ad3a-499d-8b09-056348e1413a-stats-auth\") pod \"router-default-5444994796-h5ktx\" (UID: \"66cd3426-ad3a-499d-8b09-056348e1413a\") " pod="openshift-ingress/router-default-5444994796-h5ktx" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.895463 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/394e1c70-96f6-4655-bb1a-50850d43136e-etcd-ca\") pod \"etcd-operator-b45778765-kjbbz\" (UID: \"394e1c70-96f6-4655-bb1a-50850d43136e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjbbz" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.895501 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6becb00b-a391-46f2-b821-c626b7903924-node-pullsecrets\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.895531 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.895548 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/262e9d06-4e95-4017-85d9-5657f520eb49-config\") pod \"machine-api-operator-5694c8668f-8xkx9\" (UID: \"262e9d06-4e95-4017-85d9-5657f520eb49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8xkx9" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.895550 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cc1943a-ad05-4d90-89c1-2e88e30d2c56-config\") pod \"kube-apiserver-operator-766d6c64bb-r5s6w\" (UID: \"7cc1943a-ad05-4d90-89c1-2e88e30d2c56\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r5s6w" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.895610 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/befd0413-a4be-4b8f-9070-6a27ca4c2ca8-serving-cert\") pod \"service-ca-operator-777779d784-8x25t\" (UID: \"befd0413-a4be-4b8f-9070-6a27ca4c2ca8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8x25t" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.895639 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d14ea7a8-bb04-40c4-a285-a69a80f1bc5a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-c567c\" (UID: \"d14ea7a8-bb04-40c4-a285-a69a80f1bc5a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c567c" Nov 25 09:06:42 crc 
kubenswrapper[4565]: I1125 09:06:42.895659 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebcf1227-407e-41f6-bf75-d755ecfd724f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-72h5b\" (UID: \"ebcf1227-407e-41f6-bf75-d755ecfd724f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-72h5b" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.895667 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/755dab00-cc07-483e-82b6-8a3e54e6dee3-trusted-ca\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.895712 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/684d871e-b542-4963-be08-7dba0c7b6d6a-config\") pod \"controller-manager-879f6c89f-7lhlj\" (UID: \"684d871e-b542-4963-be08-7dba0c7b6d6a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.895732 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bebd549b-749f-4e30-ab21-99c32a85c0ca-available-featuregates\") pod \"openshift-config-operator-7777fb866f-s7fbs\" (UID: \"bebd549b-749f-4e30-ab21-99c32a85c0ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s7fbs" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.895757 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c-audit-dir\") pod \"apiserver-7bbb656c7d-vj2db\" (UID: 
\"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.895772 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/12c5c094-a435-4607-9e17-82b36d40c672-registration-dir\") pod \"csi-hostpathplugin-crrxv\" (UID: \"12c5c094-a435-4607-9e17-82b36d40c672\") " pod="hostpath-provisioner/csi-hostpathplugin-crrxv" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.896020 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cc1943a-ad05-4d90-89c1-2e88e30d2c56-config\") pod \"kube-apiserver-operator-766d6c64bb-r5s6w\" (UID: \"7cc1943a-ad05-4d90-89c1-2e88e30d2c56\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r5s6w" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.896134 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/684d871e-b542-4963-be08-7dba0c7b6d6a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7lhlj\" (UID: \"684d871e-b542-4963-be08-7dba0c7b6d6a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.896398 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bebd549b-749f-4e30-ab21-99c32a85c0ca-available-featuregates\") pod \"openshift-config-operator-7777fb866f-s7fbs\" (UID: \"bebd549b-749f-4e30-ab21-99c32a85c0ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s7fbs" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.896542 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/755dab00-cc07-483e-82b6-8a3e54e6dee3-trusted-ca\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.896611 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.896635 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29p88\" (UniqueName: \"kubernetes.io/projected/30d6cc0f-1d5e-4675-9e42-69c79794ff83-kube-api-access-29p88\") pod \"machine-config-server-d6pbv\" (UID: \"30d6cc0f-1d5e-4675-9e42-69c79794ff83\") " pod="openshift-machine-config-operator/machine-config-server-d6pbv" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.896678 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.896694 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c78a2319-04ae-46f7-8b1c-7baec60f1dc6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-v76jt\" (UID: \"c78a2319-04ae-46f7-8b1c-7baec60f1dc6\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v76jt" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.896714 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/684d871e-b542-4963-be08-7dba0c7b6d6a-config\") pod \"controller-manager-879f6c89f-7lhlj\" (UID: \"684d871e-b542-4963-be08-7dba0c7b6d6a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.896710 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8kzl\" (UniqueName: \"kubernetes.io/projected/91f8c44a-7b95-4212-8976-753251e9959b-kube-api-access-l8kzl\") pod \"marketplace-operator-79b997595-f7mzc\" (UID: \"91f8c44a-7b95-4212-8976-753251e9959b\") " pod="openshift-marketplace/marketplace-operator-79b997595-f7mzc" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.896949 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/262e9d06-4e95-4017-85d9-5657f520eb49-images\") pod \"machine-api-operator-5694c8668f-8xkx9\" (UID: \"262e9d06-4e95-4017-85d9-5657f520eb49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8xkx9" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.896995 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2b8d9fc-1eab-46f7-a214-59255f20c0db-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-n4r7s\" (UID: \"d2b8d9fc-1eab-46f7-a214-59255f20c0db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n4r7s" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.897016 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/4fe46bc1-6ac5-4578-a9d1-7ce7345b8ff0-srv-cert\") pod \"olm-operator-6b444d44fb-w788b\" (UID: \"4fe46bc1-6ac5-4578-a9d1-7ce7345b8ff0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w788b" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.897051 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/66eb132d-fe22-490c-bd19-3c52de3b56ee-apiservice-cert\") pod \"packageserver-d55dfcdfc-f8pjc\" (UID: \"66eb132d-fe22-490c-bd19-3c52de3b56ee\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f8pjc" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.897066 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c78a2319-04ae-46f7-8b1c-7baec60f1dc6-config\") pod \"kube-controller-manager-operator-78b949d7b-v76jt\" (UID: \"c78a2319-04ae-46f7-8b1c-7baec60f1dc6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v76jt" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.897086 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnp4z\" (UniqueName: \"kubernetes.io/projected/0587232c-6c1f-44d8-b7d0-be44d147bd71-kube-api-access-lnp4z\") pod \"dns-operator-744455d44c-4bqts\" (UID: \"0587232c-6c1f-44d8-b7d0-be44d147bd71\") " pod="openshift-dns-operator/dns-operator-744455d44c-4bqts" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.897121 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86e733b7-6fa7-4086-bb82-95cc1b9dfa51-proxy-tls\") pod \"machine-config-controller-84d6567774-cgzl7\" (UID: \"86e733b7-6fa7-4086-bb82-95cc1b9dfa51\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cgzl7" 
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.897139 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf63e97f-f001-452c-8d17-1b8a4e40c3ae-config\") pod \"machine-approver-56656f9798-926vb\" (UID: \"cf63e97f-f001-452c-8d17-1b8a4e40c3ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-926vb" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.897155 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c-serving-cert\") pod \"apiserver-7bbb656c7d-vj2db\" (UID: \"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.897486 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.897699 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9sjf\" (UniqueName: \"kubernetes.io/projected/918a1a54-550d-4e79-b0c9-360d88fd8a6e-kube-api-access-p9sjf\") pod \"authentication-operator-69f744f599-pgjmj\" (UID: \"918a1a54-550d-4e79-b0c9-360d88fd8a6e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pgjmj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.897741 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkpkl\" (UniqueName: 
\"kubernetes.io/projected/66cd3426-ad3a-499d-8b09-056348e1413a-kube-api-access-mkpkl\") pod \"router-default-5444994796-h5ktx\" (UID: \"66cd3426-ad3a-499d-8b09-056348e1413a\") " pod="openshift-ingress/router-default-5444994796-h5ktx" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.897758 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3193422a-c236-406b-a51a-e2865b720ff4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2fq46\" (UID: \"3193422a-c236-406b-a51a-e2865b720ff4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2fq46" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.897784 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ebcf1227-407e-41f6-bf75-d755ecfd724f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-72h5b\" (UID: \"ebcf1227-407e-41f6-bf75-d755ecfd724f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-72h5b" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.897882 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a888c987-e97d-4f33-9932-158161870fe6-config-volume\") pod \"collect-profiles-29401020-4vwj8\" (UID: \"a888c987-e97d-4f33-9932-158161870fe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401020-4vwj8" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.897903 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vj2db\" (UID: \"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" 
Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.897921 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvtl2\" (UniqueName: \"kubernetes.io/projected/262e9d06-4e95-4017-85d9-5657f520eb49-kube-api-access-kvtl2\") pod \"machine-api-operator-5694c8668f-8xkx9\" (UID: \"262e9d06-4e95-4017-85d9-5657f520eb49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8xkx9" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.897983 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bebd549b-749f-4e30-ab21-99c32a85c0ca-serving-cert\") pod \"openshift-config-operator-7777fb866f-s7fbs\" (UID: \"bebd549b-749f-4e30-ab21-99c32a85c0ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s7fbs" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898012 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66cd3426-ad3a-499d-8b09-056348e1413a-metrics-certs\") pod \"router-default-5444994796-h5ktx\" (UID: \"66cd3426-ad3a-499d-8b09-056348e1413a\") " pod="openshift-ingress/router-default-5444994796-h5ktx" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898041 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf63e97f-f001-452c-8d17-1b8a4e40c3ae-config\") pod \"machine-approver-56656f9798-926vb\" (UID: \"cf63e97f-f001-452c-8d17-1b8a4e40c3ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-926vb" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898101 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/91f8c44a-7b95-4212-8976-753251e9959b-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-f7mzc\" (UID: \"91f8c44a-7b95-4212-8976-753251e9959b\") " pod="openshift-marketplace/marketplace-operator-79b997595-f7mzc" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898132 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6becb00b-a391-46f2-b821-c626b7903924-config\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898148 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0587232c-6c1f-44d8-b7d0-be44d147bd71-metrics-tls\") pod \"dns-operator-744455d44c-4bqts\" (UID: \"0587232c-6c1f-44d8-b7d0-be44d147bd71\") " pod="openshift-dns-operator/dns-operator-744455d44c-4bqts" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898166 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898182 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl7km\" (UniqueName: \"kubernetes.io/projected/cf63e97f-f001-452c-8d17-1b8a4e40c3ae-kube-api-access-hl7km\") pod \"machine-approver-56656f9798-926vb\" (UID: \"cf63e97f-f001-452c-8d17-1b8a4e40c3ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-926vb" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898197 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/25752784-0e60-4517-ba99-754385fa0ecb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-fr2kb\" (UID: \"25752784-0e60-4517-ba99-754385fa0ecb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fr2kb" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898212 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp8mn\" (UniqueName: \"kubernetes.io/projected/684d871e-b542-4963-be08-7dba0c7b6d6a-kube-api-access-lp8mn\") pod \"controller-manager-879f6c89f-7lhlj\" (UID: \"684d871e-b542-4963-be08-7dba0c7b6d6a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898226 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/74651f90-e9e9-4c93-a929-26765c304243-signing-key\") pod \"service-ca-9c57cc56f-q5drb\" (UID: \"74651f90-e9e9-4c93-a929-26765c304243\") " pod="openshift-service-ca/service-ca-9c57cc56f-q5drb" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898253 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6becb00b-a391-46f2-b821-c626b7903924-etcd-serving-ca\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898268 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/394e1c70-96f6-4655-bb1a-50850d43136e-config\") pod \"etcd-operator-b45778765-kjbbz\" (UID: \"394e1c70-96f6-4655-bb1a-50850d43136e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjbbz" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898284 4565 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/30d6cc0f-1d5e-4675-9e42-69c79794ff83-node-bootstrap-token\") pod \"machine-config-server-d6pbv\" (UID: \"30d6cc0f-1d5e-4675-9e42-69c79794ff83\") " pod="openshift-machine-config-operator/machine-config-server-d6pbv" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898297 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/74651f90-e9e9-4c93-a929-26765c304243-signing-cabundle\") pod \"service-ca-9c57cc56f-q5drb\" (UID: \"74651f90-e9e9-4c93-a929-26765c304243\") " pod="openshift-service-ca/service-ca-9c57cc56f-q5drb" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898312 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvlt8\" (UniqueName: \"kubernetes.io/projected/4b598734-79f8-4a64-9f30-e08bd6c56bc7-kube-api-access-hvlt8\") pod \"migrator-59844c95c7-tj8jh\" (UID: \"4b598734-79f8-4a64-9f30-e08bd6c56bc7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tj8jh" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898338 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfsm2\" (UniqueName: \"kubernetes.io/projected/cd73c816-12d9-42bc-bc10-2180ed13b36a-kube-api-access-lfsm2\") pod \"downloads-7954f5f757-wbp5x\" (UID: \"cd73c816-12d9-42bc-bc10-2180ed13b36a\") " pod="openshift-console/downloads-7954f5f757-wbp5x" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898366 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nnsx\" (UniqueName: \"kubernetes.io/projected/755dab00-cc07-483e-82b6-8a3e54e6dee3-kube-api-access-8nnsx\") pod \"image-registry-697d97f7c8-fljns\" (UID: 
\"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898382 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4841ec62-bf33-407a-b240-1409e85aa22e-trusted-ca\") pod \"ingress-operator-5b745b69d9-pzm4n\" (UID: \"4841ec62-bf33-407a-b240-1409e85aa22e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pzm4n" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898397 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28gtf\" (UniqueName: \"kubernetes.io/projected/96805dde-4050-4b05-a10b-e7a1f1ed37c7-kube-api-access-28gtf\") pod \"cluster-samples-operator-665b6dd947-57ghm\" (UID: \"96805dde-4050-4b05-a10b-e7a1f1ed37c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-57ghm" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898413 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c-encryption-config\") pod \"apiserver-7bbb656c7d-vj2db\" (UID: \"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898428 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wszf2\" (UniqueName: \"kubernetes.io/projected/befd0413-a4be-4b8f-9070-6a27ca4c2ca8-kube-api-access-wszf2\") pod \"service-ca-operator-777779d784-8x25t\" (UID: \"befd0413-a4be-4b8f-9070-6a27ca4c2ca8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8x25t" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898445 4565 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6becb00b-a391-46f2-b821-c626b7903924-encryption-config\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898461 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/918a1a54-550d-4e79-b0c9-360d88fd8a6e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pgjmj\" (UID: \"918a1a54-550d-4e79-b0c9-360d88fd8a6e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pgjmj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898476 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z6x5\" (UniqueName: \"kubernetes.io/projected/f1364197-953a-4e1b-ad5d-aff1a5f0f5ba-kube-api-access-4z6x5\") pod \"package-server-manager-789f6589d5-5c4hq\" (UID: \"f1364197-953a-4e1b-ad5d-aff1a5f0f5ba\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5c4hq" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898490 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/96805dde-4050-4b05-a10b-e7a1f1ed37c7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-57ghm\" (UID: \"96805dde-4050-4b05-a10b-e7a1f1ed37c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-57ghm" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898503 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6148f932-f59f-4a65-8094-84fa424d40bb-cert\") pod \"ingress-canary-wmz6h\" (UID: \"6148f932-f59f-4a65-8094-84fa424d40bb\") " 
pod="openshift-ingress-canary/ingress-canary-wmz6h" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898520 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/755dab00-cc07-483e-82b6-8a3e54e6dee3-installation-pull-secrets\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898535 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9857afe-8e2a-4ddd-8fc2-2059bff84d86-metrics-tls\") pod \"dns-default-bdn5h\" (UID: \"f9857afe-8e2a-4ddd-8fc2-2059bff84d86\") " pod="openshift-dns/dns-default-bdn5h" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898553 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1364197-953a-4e1b-ad5d-aff1a5f0f5ba-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5c4hq\" (UID: \"f1364197-953a-4e1b-ad5d-aff1a5f0f5ba\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5c4hq" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898573 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5nfc\" (UniqueName: \"kubernetes.io/projected/d2b8d9fc-1eab-46f7-a214-59255f20c0db-kube-api-access-l5nfc\") pod \"openshift-apiserver-operator-796bbdcf4f-n4r7s\" (UID: \"d2b8d9fc-1eab-46f7-a214-59255f20c0db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n4r7s" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898587 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/918a1a54-550d-4e79-b0c9-360d88fd8a6e-serving-cert\") pod \"authentication-operator-69f744f599-pgjmj\" (UID: \"918a1a54-550d-4e79-b0c9-360d88fd8a6e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pgjmj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898600 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/12c5c094-a435-4607-9e17-82b36d40c672-plugins-dir\") pod \"csi-hostpathplugin-crrxv\" (UID: \"12c5c094-a435-4607-9e17-82b36d40c672\") " pod="hostpath-provisioner/csi-hostpathplugin-crrxv" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898615 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/755dab00-cc07-483e-82b6-8a3e54e6dee3-registry-tls\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898628 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/684d871e-b542-4963-be08-7dba0c7b6d6a-client-ca\") pod \"controller-manager-879f6c89f-7lhlj\" (UID: \"684d871e-b542-4963-be08-7dba0c7b6d6a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898642 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xk9h\" (UniqueName: \"kubernetes.io/projected/a888c987-e97d-4f33-9932-158161870fe6-kube-api-access-5xk9h\") pod \"collect-profiles-29401020-4vwj8\" (UID: \"a888c987-e97d-4f33-9932-158161870fe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401020-4vwj8" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898666 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.898892 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.899077 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6becb00b-a391-46f2-b821-c626b7903924-config\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.899401 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f49defba-4d11-4d11-8a4f-d5fcbe187c73-profile-collector-cert\") pod \"catalog-operator-68c6474976-c9xwj\" (UID: \"f49defba-4d11-4d11-8a4f-d5fcbe187c73\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9xwj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.899448 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/30d6cc0f-1d5e-4675-9e42-69c79794ff83-certs\") pod \"machine-config-server-d6pbv\" (UID: \"30d6cc0f-1d5e-4675-9e42-69c79794ff83\") " 
pod="openshift-machine-config-operator/machine-config-server-d6pbv" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.899489 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/66eb132d-fe22-490c-bd19-3c52de3b56ee-webhook-cert\") pod \"packageserver-d55dfcdfc-f8pjc\" (UID: \"66eb132d-fe22-490c-bd19-3c52de3b56ee\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f8pjc" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.899508 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c-audit-policies\") pod \"apiserver-7bbb656c7d-vj2db\" (UID: \"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.899535 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/262e9d06-4e95-4017-85d9-5657f520eb49-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8xkx9\" (UID: \"262e9d06-4e95-4017-85d9-5657f520eb49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8xkx9" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.899551 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrrwl\" (UniqueName: \"kubernetes.io/projected/74651f90-e9e9-4c93-a929-26765c304243-kube-api-access-qrrwl\") pod \"service-ca-9c57cc56f-q5drb\" (UID: \"74651f90-e9e9-4c93-a929-26765c304243\") " pod="openshift-service-ca/service-ca-9c57cc56f-q5drb" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.899567 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/4fe46bc1-6ac5-4578-a9d1-7ce7345b8ff0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-w788b\" (UID: \"4fe46bc1-6ac5-4578-a9d1-7ce7345b8ff0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w788b" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.899581 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/91f8c44a-7b95-4212-8976-753251e9959b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-f7mzc\" (UID: \"91f8c44a-7b95-4212-8976-753251e9959b\") " pod="openshift-marketplace/marketplace-operator-79b997595-f7mzc" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.899596 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/00c23670-fc21-4730-a27e-ac490261f994-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-smktm\" (UID: \"00c23670-fc21-4730-a27e-ac490261f994\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-smktm" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.899613 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/29e91702-afc2-470f-a3b9-9be851b01f9c-audit-policies\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.899627 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/394e1c70-96f6-4655-bb1a-50850d43136e-serving-cert\") pod \"etcd-operator-b45778765-kjbbz\" (UID: \"394e1c70-96f6-4655-bb1a-50850d43136e\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-kjbbz" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.899643 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6becb00b-a391-46f2-b821-c626b7903924-etcd-client\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.899657 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cc1943a-ad05-4d90-89c1-2e88e30d2c56-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-r5s6w\" (UID: \"7cc1943a-ad05-4d90-89c1-2e88e30d2c56\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r5s6w" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.899670 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/918a1a54-550d-4e79-b0c9-360d88fd8a6e-service-ca-bundle\") pod \"authentication-operator-69f744f599-pgjmj\" (UID: \"918a1a54-550d-4e79-b0c9-360d88fd8a6e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pgjmj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.899686 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25752784-0e60-4517-ba99-754385fa0ecb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-fr2kb\" (UID: \"25752784-0e60-4517-ba99-754385fa0ecb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fr2kb" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.899702 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc24q\" (UniqueName: 
\"kubernetes.io/projected/3193422a-c236-406b-a51a-e2865b720ff4-kube-api-access-qc24q\") pod \"multus-admission-controller-857f4d67dd-2fq46\" (UID: \"3193422a-c236-406b-a51a-e2865b720ff4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2fq46" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.899717 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d46d9\" (UniqueName: \"kubernetes.io/projected/66eb132d-fe22-490c-bd19-3c52de3b56ee-kube-api-access-d46d9\") pod \"packageserver-d55dfcdfc-f8pjc\" (UID: \"66eb132d-fe22-490c-bd19-3c52de3b56ee\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f8pjc" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.899744 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/755dab00-cc07-483e-82b6-8a3e54e6dee3-ca-trust-extracted\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.899757 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6becb00b-a391-46f2-b821-c626b7903924-audit\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.899771 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bdnw\" (UniqueName: \"kubernetes.io/projected/4fe46bc1-6ac5-4578-a9d1-7ce7345b8ff0-kube-api-access-9bdnw\") pod \"olm-operator-6b444d44fb-w788b\" (UID: \"4fe46bc1-6ac5-4578-a9d1-7ce7345b8ff0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w788b" Nov 25 09:06:42 crc kubenswrapper[4565]: 
I1125 09:06:42.899783 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9857afe-8e2a-4ddd-8fc2-2059bff84d86-config-volume\") pod \"dns-default-bdn5h\" (UID: \"f9857afe-8e2a-4ddd-8fc2-2059bff84d86\") " pod="openshift-dns/dns-default-bdn5h" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.899799 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/755dab00-cc07-483e-82b6-8a3e54e6dee3-registry-certificates\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.901788 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6becb00b-a391-46f2-b821-c626b7903924-etcd-serving-ca\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.902222 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/394e1c70-96f6-4655-bb1a-50850d43136e-config\") pod \"etcd-operator-b45778765-kjbbz\" (UID: \"394e1c70-96f6-4655-bb1a-50850d43136e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjbbz" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903033 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4841ec62-bf33-407a-b240-1409e85aa22e-trusted-ca\") pod \"ingress-operator-5b745b69d9-pzm4n\" (UID: \"4841ec62-bf33-407a-b240-1409e85aa22e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pzm4n" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 
09:06:42.903097 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6becb00b-a391-46f2-b821-c626b7903924-image-import-ca\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903122 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903144 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903169 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/12c5c094-a435-4607-9e17-82b36d40c672-socket-dir\") pod \"csi-hostpathplugin-crrxv\" (UID: \"12c5c094-a435-4607-9e17-82b36d40c672\") " pod="hostpath-provisioner/csi-hostpathplugin-crrxv" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903190 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebcf1227-407e-41f6-bf75-d755ecfd724f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-72h5b\" (UID: 
\"ebcf1227-407e-41f6-bf75-d755ecfd724f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-72h5b" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903208 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6becb00b-a391-46f2-b821-c626b7903924-serving-cert\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903227 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a888c987-e97d-4f33-9932-158161870fe6-secret-volume\") pod \"collect-profiles-29401020-4vwj8\" (UID: \"a888c987-e97d-4f33-9932-158161870fe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401020-4vwj8" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903245 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/12c5c094-a435-4607-9e17-82b36d40c672-csi-data-dir\") pod \"csi-hostpathplugin-crrxv\" (UID: \"12c5c094-a435-4607-9e17-82b36d40c672\") " pod="hostpath-provisioner/csi-hostpathplugin-crrxv" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903261 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmh4m\" (UniqueName: \"kubernetes.io/projected/6148f932-f59f-4a65-8094-84fa424d40bb-kube-api-access-mmh4m\") pod \"ingress-canary-wmz6h\" (UID: \"6148f932-f59f-4a65-8094-84fa424d40bb\") " pod="openshift-ingress-canary/ingress-canary-wmz6h" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903300 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nsft\" (UniqueName: 
\"kubernetes.io/projected/6becb00b-a391-46f2-b821-c626b7903924-kube-api-access-5nsft\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903319 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903337 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903353 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cc1943a-ad05-4d90-89c1-2e88e30d2c56-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-r5s6w\" (UID: \"7cc1943a-ad05-4d90-89c1-2e88e30d2c56\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r5s6w" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903373 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cf63e97f-f001-452c-8d17-1b8a4e40c3ae-auth-proxy-config\") pod \"machine-approver-56656f9798-926vb\" (UID: \"cf63e97f-f001-452c-8d17-1b8a4e40c3ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-926vb" Nov 25 
09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903391 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/cf63e97f-f001-452c-8d17-1b8a4e40c3ae-machine-approver-tls\") pod \"machine-approver-56656f9798-926vb\" (UID: \"cf63e97f-f001-452c-8d17-1b8a4e40c3ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-926vb" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903409 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/684d871e-b542-4963-be08-7dba0c7b6d6a-serving-cert\") pod \"controller-manager-879f6c89f-7lhlj\" (UID: \"684d871e-b542-4963-be08-7dba0c7b6d6a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903426 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f49defba-4d11-4d11-8a4f-d5fcbe187c73-srv-cert\") pod \"catalog-operator-68c6474976-c9xwj\" (UID: \"f49defba-4d11-4d11-8a4f-d5fcbe187c73\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9xwj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903461 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtdf9\" (UniqueName: \"kubernetes.io/projected/86e733b7-6fa7-4086-bb82-95cc1b9dfa51-kube-api-access-wtdf9\") pod \"machine-config-controller-84d6567774-cgzl7\" (UID: \"86e733b7-6fa7-4086-bb82-95cc1b9dfa51\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cgzl7" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903480 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6becb00b-a391-46f2-b821-c626b7903924-audit-dir\") pod 
\"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903497 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5gvw\" (UniqueName: \"kubernetes.io/projected/29e91702-afc2-470f-a3b9-9be851b01f9c-kube-api-access-w5gvw\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903515 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/66cd3426-ad3a-499d-8b09-056348e1413a-default-certificate\") pod \"router-default-5444994796-h5ktx\" (UID: \"66cd3426-ad3a-499d-8b09-056348e1413a\") " pod="openshift-ingress/router-default-5444994796-h5ktx" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903534 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c-etcd-client\") pod \"apiserver-7bbb656c7d-vj2db\" (UID: \"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903553 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf5dc\" (UniqueName: \"kubernetes.io/projected/00c23670-fc21-4730-a27e-ac490261f994-kube-api-access-sf5dc\") pod \"control-plane-machine-set-operator-78cbb6b69f-smktm\" (UID: \"00c23670-fc21-4730-a27e-ac490261f994\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-smktm" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903599 4565 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903619 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903636 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/394e1c70-96f6-4655-bb1a-50850d43136e-etcd-client\") pod \"etcd-operator-b45778765-kjbbz\" (UID: \"394e1c70-96f6-4655-bb1a-50850d43136e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjbbz" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903651 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dc4fc3b-ec76-4231-a2ed-d623be3502c0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-gm2dl\" (UID: \"6dc4fc3b-ec76-4231-a2ed-d623be3502c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gm2dl" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903669 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/86e733b7-6fa7-4086-bb82-95cc1b9dfa51-mcc-auth-proxy-config\") pod 
\"machine-config-controller-84d6567774-cgzl7\" (UID: \"86e733b7-6fa7-4086-bb82-95cc1b9dfa51\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cgzl7" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903704 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6becb00b-a391-46f2-b821-c626b7903924-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903719 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903737 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtvbv\" (UniqueName: \"kubernetes.io/projected/cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c-kube-api-access-gtvbv\") pod \"apiserver-7bbb656c7d-vj2db\" (UID: \"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903758 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2b8d9fc-1eab-46f7-a214-59255f20c0db-config\") pod \"openshift-apiserver-operator-796bbdcf4f-n4r7s\" (UID: \"d2b8d9fc-1eab-46f7-a214-59255f20c0db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n4r7s" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903775 4565 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4841ec62-bf33-407a-b240-1409e85aa22e-metrics-tls\") pod \"ingress-operator-5b745b69d9-pzm4n\" (UID: \"4841ec62-bf33-407a-b240-1409e85aa22e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pzm4n" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903790 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dc4fc3b-ec76-4231-a2ed-d623be3502c0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-gm2dl\" (UID: \"6dc4fc3b-ec76-4231-a2ed-d623be3502c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gm2dl" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903811 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm5j5\" (UniqueName: \"kubernetes.io/projected/f49defba-4d11-4d11-8a4f-d5fcbe187c73-kube-api-access-vm5j5\") pod \"catalog-operator-68c6474976-c9xwj\" (UID: \"f49defba-4d11-4d11-8a4f-d5fcbe187c73\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9xwj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903828 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/12c5c094-a435-4607-9e17-82b36d40c672-mountpoint-dir\") pod \"csi-hostpathplugin-crrxv\" (UID: \"12c5c094-a435-4607-9e17-82b36d40c672\") " pod="hostpath-provisioner/csi-hostpathplugin-crrxv" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903865 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/29e91702-afc2-470f-a3b9-9be851b01f9c-audit-dir\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: 
\"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903883 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4841ec62-bf33-407a-b240-1409e85aa22e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-pzm4n\" (UID: \"4841ec62-bf33-407a-b240-1409e85aa22e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pzm4n" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903898 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/918a1a54-550d-4e79-b0c9-360d88fd8a6e-config\") pod \"authentication-operator-69f744f599-pgjmj\" (UID: \"918a1a54-550d-4e79-b0c9-360d88fd8a6e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pgjmj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903919 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tzsh\" (UniqueName: \"kubernetes.io/projected/25752784-0e60-4517-ba99-754385fa0ecb-kube-api-access-4tzsh\") pod \"openshift-controller-manager-operator-756b6f6bc6-fr2kb\" (UID: \"25752784-0e60-4517-ba99-754385fa0ecb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fr2kb" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903955 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7294c\" (UniqueName: \"kubernetes.io/projected/bebd549b-749f-4e30-ab21-99c32a85c0ca-kube-api-access-7294c\") pod \"openshift-config-operator-7777fb866f-s7fbs\" (UID: \"bebd549b-749f-4e30-ab21-99c32a85c0ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s7fbs" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.903977 4565 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfmvd\" (UniqueName: \"kubernetes.io/projected/d14ea7a8-bb04-40c4-a285-a69a80f1bc5a-kube-api-access-mfmvd\") pod \"machine-config-operator-74547568cd-c567c\" (UID: \"d14ea7a8-bb04-40c4-a285-a69a80f1bc5a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c567c" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.904007 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vj2db\" (UID: \"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.904063 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/755dab00-cc07-483e-82b6-8a3e54e6dee3-bound-sa-token\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.904082 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/394e1c70-96f6-4655-bb1a-50850d43136e-etcd-service-ca\") pod \"etcd-operator-b45778765-kjbbz\" (UID: \"394e1c70-96f6-4655-bb1a-50850d43136e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjbbz" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.904100 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/66eb132d-fe22-490c-bd19-3c52de3b56ee-tmpfs\") pod \"packageserver-d55dfcdfc-f8pjc\" (UID: \"66eb132d-fe22-490c-bd19-3c52de3b56ee\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f8pjc" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.904118 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/befd0413-a4be-4b8f-9070-6a27ca4c2ca8-config\") pod \"service-ca-operator-777779d784-8x25t\" (UID: \"befd0413-a4be-4b8f-9070-6a27ca4c2ca8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8x25t" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.904135 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d14ea7a8-bb04-40c4-a285-a69a80f1bc5a-images\") pod \"machine-config-operator-74547568cd-c567c\" (UID: \"d14ea7a8-bb04-40c4-a285-a69a80f1bc5a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c567c" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.904156 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gv5p\" (UniqueName: \"kubernetes.io/projected/4841ec62-bf33-407a-b240-1409e85aa22e-kube-api-access-5gv5p\") pod \"ingress-operator-5b745b69d9-pzm4n\" (UID: \"4841ec62-bf33-407a-b240-1409e85aa22e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pzm4n" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.904173 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d14ea7a8-bb04-40c4-a285-a69a80f1bc5a-proxy-tls\") pod \"machine-config-operator-74547568cd-c567c\" (UID: \"d14ea7a8-bb04-40c4-a285-a69a80f1bc5a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c567c" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.905261 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/25752784-0e60-4517-ba99-754385fa0ecb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-fr2kb\" (UID: \"25752784-0e60-4517-ba99-754385fa0ecb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fr2kb" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.905676 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bebd549b-749f-4e30-ab21-99c32a85c0ca-serving-cert\") pod \"openshift-config-operator-7777fb866f-s7fbs\" (UID: \"bebd549b-749f-4e30-ab21-99c32a85c0ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s7fbs" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.906032 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0587232c-6c1f-44d8-b7d0-be44d147bd71-metrics-tls\") pod \"dns-operator-744455d44c-4bqts\" (UID: \"0587232c-6c1f-44d8-b7d0-be44d147bd71\") " pod="openshift-dns-operator/dns-operator-744455d44c-4bqts" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.906355 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2b8d9fc-1eab-46f7-a214-59255f20c0db-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-n4r7s\" (UID: \"d2b8d9fc-1eab-46f7-a214-59255f20c0db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n4r7s" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.911444 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/918a1a54-550d-4e79-b0c9-360d88fd8a6e-config\") pod \"authentication-operator-69f744f599-pgjmj\" (UID: \"918a1a54-550d-4e79-b0c9-360d88fd8a6e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pgjmj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.913005 4565 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cf63e97f-f001-452c-8d17-1b8a4e40c3ae-auth-proxy-config\") pod \"machine-approver-56656f9798-926vb\" (UID: \"cf63e97f-f001-452c-8d17-1b8a4e40c3ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-926vb" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.913112 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.913243 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6becb00b-a391-46f2-b821-c626b7903924-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.913789 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.913801 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cc1943a-ad05-4d90-89c1-2e88e30d2c56-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-r5s6w\" (UID: \"7cc1943a-ad05-4d90-89c1-2e88e30d2c56\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r5s6w" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.914286 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebcf1227-407e-41f6-bf75-d755ecfd724f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-72h5b\" (UID: \"ebcf1227-407e-41f6-bf75-d755ecfd724f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-72h5b" Nov 25 09:06:42 crc kubenswrapper[4565]: E1125 09:06:42.915311 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:43.415294469 +0000 UTC m=+136.617789607 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.917739 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.899488 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/918a1a54-550d-4e79-b0c9-360d88fd8a6e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pgjmj\" (UID: \"918a1a54-550d-4e79-b0c9-360d88fd8a6e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pgjmj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.922265 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/684d871e-b542-4963-be08-7dba0c7b6d6a-client-ca\") pod \"controller-manager-879f6c89f-7lhlj\" (UID: \"684d871e-b542-4963-be08-7dba0c7b6d6a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.922903 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/29e91702-afc2-470f-a3b9-9be851b01f9c-audit-dir\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.922948 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.923333 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/918a1a54-550d-4e79-b0c9-360d88fd8a6e-service-ca-bundle\") pod \"authentication-operator-69f744f599-pgjmj\" (UID: \"918a1a54-550d-4e79-b0c9-360d88fd8a6e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pgjmj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.923676 4565 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6becb00b-a391-46f2-b821-c626b7903924-serving-cert\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.924591 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2b8d9fc-1eab-46f7-a214-59255f20c0db-config\") pod \"openshift-apiserver-operator-796bbdcf4f-n4r7s\" (UID: \"d2b8d9fc-1eab-46f7-a214-59255f20c0db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n4r7s" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.924820 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.924905 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.925105 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.925344 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/394e1c70-96f6-4655-bb1a-50850d43136e-serving-cert\") pod \"etcd-operator-b45778765-kjbbz\" (UID: \"394e1c70-96f6-4655-bb1a-50850d43136e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjbbz" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.925652 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/755dab00-cc07-483e-82b6-8a3e54e6dee3-installation-pull-secrets\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.926030 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/262e9d06-4e95-4017-85d9-5657f520eb49-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8xkx9\" (UID: \"262e9d06-4e95-4017-85d9-5657f520eb49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8xkx9" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.926617 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/755dab00-cc07-483e-82b6-8a3e54e6dee3-ca-trust-extracted\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.926669 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/918a1a54-550d-4e79-b0c9-360d88fd8a6e-serving-cert\") pod 
\"authentication-operator-69f744f599-pgjmj\" (UID: \"918a1a54-550d-4e79-b0c9-360d88fd8a6e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pgjmj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.926908 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/394e1c70-96f6-4655-bb1a-50850d43136e-etcd-service-ca\") pod \"etcd-operator-b45778765-kjbbz\" (UID: \"394e1c70-96f6-4655-bb1a-50850d43136e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjbbz" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.927070 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6becb00b-a391-46f2-b821-c626b7903924-audit\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.927113 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6becb00b-a391-46f2-b821-c626b7903924-audit-dir\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.927511 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/755dab00-cc07-483e-82b6-8a3e54e6dee3-registry-certificates\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.927974 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/29e91702-afc2-470f-a3b9-9be851b01f9c-audit-policies\") pod 
\"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.928241 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6becb00b-a391-46f2-b821-c626b7903924-image-import-ca\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.929318 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/cf63e97f-f001-452c-8d17-1b8a4e40c3ae-machine-approver-tls\") pod \"machine-approver-56656f9798-926vb\" (UID: \"cf63e97f-f001-452c-8d17-1b8a4e40c3ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-926vb" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.929339 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/394e1c70-96f6-4655-bb1a-50850d43136e-etcd-client\") pod \"etcd-operator-b45778765-kjbbz\" (UID: \"394e1c70-96f6-4655-bb1a-50850d43136e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjbbz" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.929881 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.929945 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/6becb00b-a391-46f2-b821-c626b7903924-etcd-client\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.930443 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6becb00b-a391-46f2-b821-c626b7903924-encryption-config\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.930911 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/755dab00-cc07-483e-82b6-8a3e54e6dee3-registry-tls\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.931044 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/684d871e-b542-4963-be08-7dba0c7b6d6a-serving-cert\") pod \"controller-manager-879f6c89f-7lhlj\" (UID: \"684d871e-b542-4963-be08-7dba0c7b6d6a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.932500 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4841ec62-bf33-407a-b240-1409e85aa22e-metrics-tls\") pod \"ingress-operator-5b745b69d9-pzm4n\" (UID: \"4841ec62-bf33-407a-b240-1409e85aa22e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pzm4n" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.932499 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" 
(UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.933311 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4vsc\" (UniqueName: \"kubernetes.io/projected/394e1c70-96f6-4655-bb1a-50850d43136e-kube-api-access-b4vsc\") pod \"etcd-operator-b45778765-kjbbz\" (UID: \"394e1c70-96f6-4655-bb1a-50850d43136e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjbbz" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.933708 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25752784-0e60-4517-ba99-754385fa0ecb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-fr2kb\" (UID: \"25752784-0e60-4517-ba99-754385fa0ecb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fr2kb" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.951457 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnp4z\" (UniqueName: \"kubernetes.io/projected/0587232c-6c1f-44d8-b7d0-be44d147bd71-kube-api-access-lnp4z\") pod \"dns-operator-744455d44c-4bqts\" (UID: \"0587232c-6c1f-44d8-b7d0-be44d147bd71\") " pod="openshift-dns-operator/dns-operator-744455d44c-4bqts" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.970125 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr"] Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.977443 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/ebcf1227-407e-41f6-bf75-d755ecfd724f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-72h5b\" (UID: \"ebcf1227-407e-41f6-bf75-d755ecfd724f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-72h5b" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.990916 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9sjf\" (UniqueName: \"kubernetes.io/projected/918a1a54-550d-4e79-b0c9-360d88fd8a6e-kube-api-access-p9sjf\") pod \"authentication-operator-69f744f599-pgjmj\" (UID: \"918a1a54-550d-4e79-b0c9-360d88fd8a6e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pgjmj" Nov 25 09:06:42 crc kubenswrapper[4565]: I1125 09:06:42.991662 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ls4wr"] Nov 25 09:06:43 crc kubenswrapper[4565]: W1125 09:06:43.002285 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4792f05e_1c4a_4c87_991d_e26851501f52.slice/crio-abf3474173968dff3231789803664f976480bf7c34d7ec4351be24b22df64ac2 WatchSource:0}: Error finding container abf3474173968dff3231789803664f976480bf7c34d7ec4351be24b22df64ac2: Status 404 returned error can't find the container with id abf3474173968dff3231789803664f976480bf7c34d7ec4351be24b22df64ac2 Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005113 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005275 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-mkpkl\" (UniqueName: \"kubernetes.io/projected/66cd3426-ad3a-499d-8b09-056348e1413a-kube-api-access-mkpkl\") pod \"router-default-5444994796-h5ktx\" (UID: \"66cd3426-ad3a-499d-8b09-056348e1413a\") " pod="openshift-ingress/router-default-5444994796-h5ktx" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005303 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3193422a-c236-406b-a51a-e2865b720ff4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2fq46\" (UID: \"3193422a-c236-406b-a51a-e2865b720ff4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2fq46" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005328 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a888c987-e97d-4f33-9932-158161870fe6-config-volume\") pod \"collect-profiles-29401020-4vwj8\" (UID: \"a888c987-e97d-4f33-9932-158161870fe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401020-4vwj8" Nov 25 09:06:43 crc kubenswrapper[4565]: E1125 09:06:43.005351 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:43.505336101 +0000 UTC m=+136.707831240 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005379 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vj2db\" (UID: \"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005412 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66cd3426-ad3a-499d-8b09-056348e1413a-metrics-certs\") pod \"router-default-5444994796-h5ktx\" (UID: \"66cd3426-ad3a-499d-8b09-056348e1413a\") " pod="openshift-ingress/router-default-5444994796-h5ktx" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005428 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/91f8c44a-7b95-4212-8976-753251e9959b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-f7mzc\" (UID: \"91f8c44a-7b95-4212-8976-753251e9959b\") " pod="openshift-marketplace/marketplace-operator-79b997595-f7mzc" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005457 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/74651f90-e9e9-4c93-a929-26765c304243-signing-key\") pod \"service-ca-9c57cc56f-q5drb\" (UID: 
\"74651f90-e9e9-4c93-a929-26765c304243\") " pod="openshift-service-ca/service-ca-9c57cc56f-q5drb" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005472 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/30d6cc0f-1d5e-4675-9e42-69c79794ff83-node-bootstrap-token\") pod \"machine-config-server-d6pbv\" (UID: \"30d6cc0f-1d5e-4675-9e42-69c79794ff83\") " pod="openshift-machine-config-operator/machine-config-server-d6pbv" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005485 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/74651f90-e9e9-4c93-a929-26765c304243-signing-cabundle\") pod \"service-ca-9c57cc56f-q5drb\" (UID: \"74651f90-e9e9-4c93-a929-26765c304243\") " pod="openshift-service-ca/service-ca-9c57cc56f-q5drb" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005500 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvlt8\" (UniqueName: \"kubernetes.io/projected/4b598734-79f8-4a64-9f30-e08bd6c56bc7-kube-api-access-hvlt8\") pod \"migrator-59844c95c7-tj8jh\" (UID: \"4b598734-79f8-4a64-9f30-e08bd6c56bc7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tj8jh" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005533 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28gtf\" (UniqueName: \"kubernetes.io/projected/96805dde-4050-4b05-a10b-e7a1f1ed37c7-kube-api-access-28gtf\") pod \"cluster-samples-operator-665b6dd947-57ghm\" (UID: \"96805dde-4050-4b05-a10b-e7a1f1ed37c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-57ghm" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005548 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c-encryption-config\") pod \"apiserver-7bbb656c7d-vj2db\" (UID: \"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005562 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wszf2\" (UniqueName: \"kubernetes.io/projected/befd0413-a4be-4b8f-9070-6a27ca4c2ca8-kube-api-access-wszf2\") pod \"service-ca-operator-777779d784-8x25t\" (UID: \"befd0413-a4be-4b8f-9070-6a27ca4c2ca8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8x25t" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005577 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z6x5\" (UniqueName: \"kubernetes.io/projected/f1364197-953a-4e1b-ad5d-aff1a5f0f5ba-kube-api-access-4z6x5\") pod \"package-server-manager-789f6589d5-5c4hq\" (UID: \"f1364197-953a-4e1b-ad5d-aff1a5f0f5ba\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5c4hq" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005595 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/96805dde-4050-4b05-a10b-e7a1f1ed37c7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-57ghm\" (UID: \"96805dde-4050-4b05-a10b-e7a1f1ed37c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-57ghm" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005608 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6148f932-f59f-4a65-8094-84fa424d40bb-cert\") pod \"ingress-canary-wmz6h\" (UID: \"6148f932-f59f-4a65-8094-84fa424d40bb\") " pod="openshift-ingress-canary/ingress-canary-wmz6h" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005623 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9857afe-8e2a-4ddd-8fc2-2059bff84d86-metrics-tls\") pod \"dns-default-bdn5h\" (UID: \"f9857afe-8e2a-4ddd-8fc2-2059bff84d86\") " pod="openshift-dns/dns-default-bdn5h" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005638 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1364197-953a-4e1b-ad5d-aff1a5f0f5ba-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5c4hq\" (UID: \"f1364197-953a-4e1b-ad5d-aff1a5f0f5ba\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5c4hq" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005658 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/12c5c094-a435-4607-9e17-82b36d40c672-plugins-dir\") pod \"csi-hostpathplugin-crrxv\" (UID: \"12c5c094-a435-4607-9e17-82b36d40c672\") " pod="hostpath-provisioner/csi-hostpathplugin-crrxv" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005674 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xk9h\" (UniqueName: \"kubernetes.io/projected/a888c987-e97d-4f33-9932-158161870fe6-kube-api-access-5xk9h\") pod \"collect-profiles-29401020-4vwj8\" (UID: \"a888c987-e97d-4f33-9932-158161870fe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401020-4vwj8" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005690 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f49defba-4d11-4d11-8a4f-d5fcbe187c73-profile-collector-cert\") pod \"catalog-operator-68c6474976-c9xwj\" (UID: \"f49defba-4d11-4d11-8a4f-d5fcbe187c73\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9xwj" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005705 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/30d6cc0f-1d5e-4675-9e42-69c79794ff83-certs\") pod \"machine-config-server-d6pbv\" (UID: \"30d6cc0f-1d5e-4675-9e42-69c79794ff83\") " pod="openshift-machine-config-operator/machine-config-server-d6pbv" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005718 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/66eb132d-fe22-490c-bd19-3c52de3b56ee-webhook-cert\") pod \"packageserver-d55dfcdfc-f8pjc\" (UID: \"66eb132d-fe22-490c-bd19-3c52de3b56ee\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f8pjc" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005731 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c-audit-policies\") pod \"apiserver-7bbb656c7d-vj2db\" (UID: \"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005751 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrrwl\" (UniqueName: \"kubernetes.io/projected/74651f90-e9e9-4c93-a929-26765c304243-kube-api-access-qrrwl\") pod \"service-ca-9c57cc56f-q5drb\" (UID: \"74651f90-e9e9-4c93-a929-26765c304243\") " pod="openshift-service-ca/service-ca-9c57cc56f-q5drb" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005767 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4fe46bc1-6ac5-4578-a9d1-7ce7345b8ff0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-w788b\" 
(UID: \"4fe46bc1-6ac5-4578-a9d1-7ce7345b8ff0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w788b" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005780 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/91f8c44a-7b95-4212-8976-753251e9959b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-f7mzc\" (UID: \"91f8c44a-7b95-4212-8976-753251e9959b\") " pod="openshift-marketplace/marketplace-operator-79b997595-f7mzc" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005795 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/00c23670-fc21-4730-a27e-ac490261f994-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-smktm\" (UID: \"00c23670-fc21-4730-a27e-ac490261f994\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-smktm" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005816 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc24q\" (UniqueName: \"kubernetes.io/projected/3193422a-c236-406b-a51a-e2865b720ff4-kube-api-access-qc24q\") pod \"multus-admission-controller-857f4d67dd-2fq46\" (UID: \"3193422a-c236-406b-a51a-e2865b720ff4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2fq46" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005831 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d46d9\" (UniqueName: \"kubernetes.io/projected/66eb132d-fe22-490c-bd19-3c52de3b56ee-kube-api-access-d46d9\") pod \"packageserver-d55dfcdfc-f8pjc\" (UID: \"66eb132d-fe22-490c-bd19-3c52de3b56ee\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f8pjc" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005856 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bdnw\" (UniqueName: \"kubernetes.io/projected/4fe46bc1-6ac5-4578-a9d1-7ce7345b8ff0-kube-api-access-9bdnw\") pod \"olm-operator-6b444d44fb-w788b\" (UID: \"4fe46bc1-6ac5-4578-a9d1-7ce7345b8ff0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w788b" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005869 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9857afe-8e2a-4ddd-8fc2-2059bff84d86-config-volume\") pod \"dns-default-bdn5h\" (UID: \"f9857afe-8e2a-4ddd-8fc2-2059bff84d86\") " pod="openshift-dns/dns-default-bdn5h" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005884 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/12c5c094-a435-4607-9e17-82b36d40c672-socket-dir\") pod \"csi-hostpathplugin-crrxv\" (UID: \"12c5c094-a435-4607-9e17-82b36d40c672\") " pod="hostpath-provisioner/csi-hostpathplugin-crrxv" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005896 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a888c987-e97d-4f33-9932-158161870fe6-config-volume\") pod \"collect-profiles-29401020-4vwj8\" (UID: \"a888c987-e97d-4f33-9932-158161870fe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401020-4vwj8" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005901 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a888c987-e97d-4f33-9932-158161870fe6-secret-volume\") pod \"collect-profiles-29401020-4vwj8\" (UID: \"a888c987-e97d-4f33-9932-158161870fe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401020-4vwj8" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 
09:06:43.005957 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/12c5c094-a435-4607-9e17-82b36d40c672-csi-data-dir\") pod \"csi-hostpathplugin-crrxv\" (UID: \"12c5c094-a435-4607-9e17-82b36d40c672\") " pod="hostpath-provisioner/csi-hostpathplugin-crrxv" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.005976 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmh4m\" (UniqueName: \"kubernetes.io/projected/6148f932-f59f-4a65-8094-84fa424d40bb-kube-api-access-mmh4m\") pod \"ingress-canary-wmz6h\" (UID: \"6148f932-f59f-4a65-8094-84fa424d40bb\") " pod="openshift-ingress-canary/ingress-canary-wmz6h" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006000 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f49defba-4d11-4d11-8a4f-d5fcbe187c73-srv-cert\") pod \"catalog-operator-68c6474976-c9xwj\" (UID: \"f49defba-4d11-4d11-8a4f-d5fcbe187c73\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9xwj" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006027 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtdf9\" (UniqueName: \"kubernetes.io/projected/86e733b7-6fa7-4086-bb82-95cc1b9dfa51-kube-api-access-wtdf9\") pod \"machine-config-controller-84d6567774-cgzl7\" (UID: \"86e733b7-6fa7-4086-bb82-95cc1b9dfa51\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cgzl7" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006048 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/66cd3426-ad3a-499d-8b09-056348e1413a-default-certificate\") pod \"router-default-5444994796-h5ktx\" (UID: \"66cd3426-ad3a-499d-8b09-056348e1413a\") " 
pod="openshift-ingress/router-default-5444994796-h5ktx" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006063 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c-etcd-client\") pod \"apiserver-7bbb656c7d-vj2db\" (UID: \"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006078 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf5dc\" (UniqueName: \"kubernetes.io/projected/00c23670-fc21-4730-a27e-ac490261f994-kube-api-access-sf5dc\") pod \"control-plane-machine-set-operator-78cbb6b69f-smktm\" (UID: \"00c23670-fc21-4730-a27e-ac490261f994\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-smktm" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006094 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006109 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dc4fc3b-ec76-4231-a2ed-d623be3502c0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-gm2dl\" (UID: \"6dc4fc3b-ec76-4231-a2ed-d623be3502c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gm2dl" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006125 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/86e733b7-6fa7-4086-bb82-95cc1b9dfa51-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-cgzl7\" (UID: \"86e733b7-6fa7-4086-bb82-95cc1b9dfa51\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cgzl7" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006146 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtvbv\" (UniqueName: \"kubernetes.io/projected/cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c-kube-api-access-gtvbv\") pod \"apiserver-7bbb656c7d-vj2db\" (UID: \"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006162 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm5j5\" (UniqueName: \"kubernetes.io/projected/f49defba-4d11-4d11-8a4f-d5fcbe187c73-kube-api-access-vm5j5\") pod \"catalog-operator-68c6474976-c9xwj\" (UID: \"f49defba-4d11-4d11-8a4f-d5fcbe187c73\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9xwj" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006175 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/12c5c094-a435-4607-9e17-82b36d40c672-mountpoint-dir\") pod \"csi-hostpathplugin-crrxv\" (UID: \"12c5c094-a435-4607-9e17-82b36d40c672\") " pod="hostpath-provisioner/csi-hostpathplugin-crrxv" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006188 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dc4fc3b-ec76-4231-a2ed-d623be3502c0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-gm2dl\" (UID: \"6dc4fc3b-ec76-4231-a2ed-d623be3502c0\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gm2dl" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006218 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfmvd\" (UniqueName: \"kubernetes.io/projected/d14ea7a8-bb04-40c4-a285-a69a80f1bc5a-kube-api-access-mfmvd\") pod \"machine-config-operator-74547568cd-c567c\" (UID: \"d14ea7a8-bb04-40c4-a285-a69a80f1bc5a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c567c" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006232 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vj2db\" (UID: \"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006252 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/66eb132d-fe22-490c-bd19-3c52de3b56ee-tmpfs\") pod \"packageserver-d55dfcdfc-f8pjc\" (UID: \"66eb132d-fe22-490c-bd19-3c52de3b56ee\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f8pjc" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006266 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d14ea7a8-bb04-40c4-a285-a69a80f1bc5a-images\") pod \"machine-config-operator-74547568cd-c567c\" (UID: \"d14ea7a8-bb04-40c4-a285-a69a80f1bc5a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c567c" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006281 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/befd0413-a4be-4b8f-9070-6a27ca4c2ca8-config\") pod \"service-ca-operator-777779d784-8x25t\" (UID: \"befd0413-a4be-4b8f-9070-6a27ca4c2ca8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8x25t" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006300 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d14ea7a8-bb04-40c4-a285-a69a80f1bc5a-proxy-tls\") pod \"machine-config-operator-74547568cd-c567c\" (UID: \"d14ea7a8-bb04-40c4-a285-a69a80f1bc5a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c567c" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006315 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66cd3426-ad3a-499d-8b09-056348e1413a-service-ca-bundle\") pod \"router-default-5444994796-h5ktx\" (UID: \"66cd3426-ad3a-499d-8b09-056348e1413a\") " pod="openshift-ingress/router-default-5444994796-h5ktx" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006329 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb6dm\" (UniqueName: \"kubernetes.io/projected/6dc4fc3b-ec76-4231-a2ed-d623be3502c0-kube-api-access-zb6dm\") pod \"kube-storage-version-migrator-operator-b67b599dd-gm2dl\" (UID: \"6dc4fc3b-ec76-4231-a2ed-d623be3502c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gm2dl" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006346 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgp2s\" (UniqueName: \"kubernetes.io/projected/f9857afe-8e2a-4ddd-8fc2-2059bff84d86-kube-api-access-mgp2s\") pod \"dns-default-bdn5h\" (UID: \"f9857afe-8e2a-4ddd-8fc2-2059bff84d86\") " pod="openshift-dns/dns-default-bdn5h" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 
09:06:43.006360 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c78a2319-04ae-46f7-8b1c-7baec60f1dc6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-v76jt\" (UID: \"c78a2319-04ae-46f7-8b1c-7baec60f1dc6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v76jt" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006376 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwgs7\" (UniqueName: \"kubernetes.io/projected/12c5c094-a435-4607-9e17-82b36d40c672-kube-api-access-qwgs7\") pod \"csi-hostpathplugin-crrxv\" (UID: \"12c5c094-a435-4607-9e17-82b36d40c672\") " pod="hostpath-provisioner/csi-hostpathplugin-crrxv" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006390 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/66cd3426-ad3a-499d-8b09-056348e1413a-stats-auth\") pod \"router-default-5444994796-h5ktx\" (UID: \"66cd3426-ad3a-499d-8b09-056348e1413a\") " pod="openshift-ingress/router-default-5444994796-h5ktx" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006405 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/befd0413-a4be-4b8f-9070-6a27ca4c2ca8-serving-cert\") pod \"service-ca-operator-777779d784-8x25t\" (UID: \"befd0413-a4be-4b8f-9070-6a27ca4c2ca8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8x25t" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006420 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d14ea7a8-bb04-40c4-a285-a69a80f1bc5a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-c567c\" (UID: \"d14ea7a8-bb04-40c4-a285-a69a80f1bc5a\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c567c" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006442 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c-audit-dir\") pod \"apiserver-7bbb656c7d-vj2db\" (UID: \"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006455 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/12c5c094-a435-4607-9e17-82b36d40c672-registration-dir\") pod \"csi-hostpathplugin-crrxv\" (UID: \"12c5c094-a435-4607-9e17-82b36d40c672\") " pod="hostpath-provisioner/csi-hostpathplugin-crrxv" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006470 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29p88\" (UniqueName: \"kubernetes.io/projected/30d6cc0f-1d5e-4675-9e42-69c79794ff83-kube-api-access-29p88\") pod \"machine-config-server-d6pbv\" (UID: \"30d6cc0f-1d5e-4675-9e42-69c79794ff83\") " pod="openshift-machine-config-operator/machine-config-server-d6pbv" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006484 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c78a2319-04ae-46f7-8b1c-7baec60f1dc6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-v76jt\" (UID: \"c78a2319-04ae-46f7-8b1c-7baec60f1dc6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v76jt" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006497 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8kzl\" (UniqueName: 
\"kubernetes.io/projected/91f8c44a-7b95-4212-8976-753251e9959b-kube-api-access-l8kzl\") pod \"marketplace-operator-79b997595-f7mzc\" (UID: \"91f8c44a-7b95-4212-8976-753251e9959b\") " pod="openshift-marketplace/marketplace-operator-79b997595-f7mzc" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006512 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4fe46bc1-6ac5-4578-a9d1-7ce7345b8ff0-srv-cert\") pod \"olm-operator-6b444d44fb-w788b\" (UID: \"4fe46bc1-6ac5-4578-a9d1-7ce7345b8ff0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w788b" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006524 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/66eb132d-fe22-490c-bd19-3c52de3b56ee-apiservice-cert\") pod \"packageserver-d55dfcdfc-f8pjc\" (UID: \"66eb132d-fe22-490c-bd19-3c52de3b56ee\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f8pjc" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006538 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c78a2319-04ae-46f7-8b1c-7baec60f1dc6-config\") pod \"kube-controller-manager-operator-78b949d7b-v76jt\" (UID: \"c78a2319-04ae-46f7-8b1c-7baec60f1dc6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v76jt" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006556 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86e733b7-6fa7-4086-bb82-95cc1b9dfa51-proxy-tls\") pod \"machine-config-controller-84d6567774-cgzl7\" (UID: \"86e733b7-6fa7-4086-bb82-95cc1b9dfa51\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cgzl7" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 
09:06:43.006571 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c-serving-cert\") pod \"apiserver-7bbb656c7d-vj2db\" (UID: \"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.006750 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vj2db\" (UID: \"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.009060 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c-serving-cert\") pod \"apiserver-7bbb656c7d-vj2db\" (UID: \"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.009133 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/12c5c094-a435-4607-9e17-82b36d40c672-csi-data-dir\") pod \"csi-hostpathplugin-crrxv\" (UID: \"12c5c094-a435-4607-9e17-82b36d40c672\") " pod="hostpath-provisioner/csi-hostpathplugin-crrxv" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.009645 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66cd3426-ad3a-499d-8b09-056348e1413a-metrics-certs\") pod \"router-default-5444994796-h5ktx\" (UID: \"66cd3426-ad3a-499d-8b09-056348e1413a\") " pod="openshift-ingress/router-default-5444994796-h5ktx" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.011365 4565 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f49defba-4d11-4d11-8a4f-d5fcbe187c73-srv-cert\") pod \"catalog-operator-68c6474976-c9xwj\" (UID: \"f49defba-4d11-4d11-8a4f-d5fcbe187c73\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9xwj" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.011525 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/74651f90-e9e9-4c93-a929-26765c304243-signing-key\") pod \"service-ca-9c57cc56f-q5drb\" (UID: \"74651f90-e9e9-4c93-a929-26765c304243\") " pod="openshift-service-ca/service-ca-9c57cc56f-q5drb" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.011633 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/74651f90-e9e9-4c93-a929-26765c304243-signing-cabundle\") pod \"service-ca-9c57cc56f-q5drb\" (UID: \"74651f90-e9e9-4c93-a929-26765c304243\") " pod="openshift-service-ca/service-ca-9c57cc56f-q5drb" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.012026 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c-encryption-config\") pod \"apiserver-7bbb656c7d-vj2db\" (UID: \"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.012044 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/12c5c094-a435-4607-9e17-82b36d40c672-mountpoint-dir\") pod \"csi-hostpathplugin-crrxv\" (UID: \"12c5c094-a435-4607-9e17-82b36d40c672\") " pod="hostpath-provisioner/csi-hostpathplugin-crrxv" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.012245 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/12c5c094-a435-4607-9e17-82b36d40c672-plugins-dir\") pod \"csi-hostpathplugin-crrxv\" (UID: \"12c5c094-a435-4607-9e17-82b36d40c672\") " pod="hostpath-provisioner/csi-hostpathplugin-crrxv" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.012449 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c-audit-dir\") pod \"apiserver-7bbb656c7d-vj2db\" (UID: \"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.012545 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66cd3426-ad3a-499d-8b09-056348e1413a-service-ca-bundle\") pod \"router-default-5444994796-h5ktx\" (UID: \"66cd3426-ad3a-499d-8b09-056348e1413a\") " pod="openshift-ingress/router-default-5444994796-h5ktx" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.012592 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/12c5c094-a435-4607-9e17-82b36d40c672-registration-dir\") pod \"csi-hostpathplugin-crrxv\" (UID: \"12c5c094-a435-4607-9e17-82b36d40c672\") " pod="hostpath-provisioner/csi-hostpathplugin-crrxv" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.013133 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d14ea7a8-bb04-40c4-a285-a69a80f1bc5a-images\") pod \"machine-config-operator-74547568cd-c567c\" (UID: \"d14ea7a8-bb04-40c4-a285-a69a80f1bc5a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c567c" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.013172 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/3193422a-c236-406b-a51a-e2865b720ff4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2fq46\" (UID: \"3193422a-c236-406b-a51a-e2865b720ff4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2fq46" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.013290 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dc4fc3b-ec76-4231-a2ed-d623be3502c0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-gm2dl\" (UID: \"6dc4fc3b-ec76-4231-a2ed-d623be3502c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gm2dl" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.013519 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/66eb132d-fe22-490c-bd19-3c52de3b56ee-tmpfs\") pod \"packageserver-d55dfcdfc-f8pjc\" (UID: \"66eb132d-fe22-490c-bd19-3c52de3b56ee\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f8pjc" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.013756 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/befd0413-a4be-4b8f-9070-6a27ca4c2ca8-config\") pod \"service-ca-operator-777779d784-8x25t\" (UID: \"befd0413-a4be-4b8f-9070-6a27ca4c2ca8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8x25t" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.013795 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c-etcd-client\") pod \"apiserver-7bbb656c7d-vj2db\" (UID: \"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:43 crc kubenswrapper[4565]: E1125 09:06:43.014425 4565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:43.514413667 +0000 UTC m=+136.716908805 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.016022 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d14ea7a8-bb04-40c4-a285-a69a80f1bc5a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-c567c\" (UID: \"d14ea7a8-bb04-40c4-a285-a69a80f1bc5a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c567c" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.016090 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vj2db\" (UID: \"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.016333 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/66cd3426-ad3a-499d-8b09-056348e1413a-default-certificate\") pod \"router-default-5444994796-h5ktx\" (UID: \"66cd3426-ad3a-499d-8b09-056348e1413a\") " pod="openshift-ingress/router-default-5444994796-h5ktx" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.016960 
4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9857afe-8e2a-4ddd-8fc2-2059bff84d86-metrics-tls\") pod \"dns-default-bdn5h\" (UID: \"f9857afe-8e2a-4ddd-8fc2-2059bff84d86\") " pod="openshift-dns/dns-default-bdn5h" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.017340 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/86e733b7-6fa7-4086-bb82-95cc1b9dfa51-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-cgzl7\" (UID: \"86e733b7-6fa7-4086-bb82-95cc1b9dfa51\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cgzl7" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.017476 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvtl2\" (UniqueName: \"kubernetes.io/projected/262e9d06-4e95-4017-85d9-5657f520eb49-kube-api-access-kvtl2\") pod \"machine-api-operator-5694c8668f-8xkx9\" (UID: \"262e9d06-4e95-4017-85d9-5657f520eb49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8xkx9" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.017733 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/12c5c094-a435-4607-9e17-82b36d40c672-socket-dir\") pod \"csi-hostpathplugin-crrxv\" (UID: \"12c5c094-a435-4607-9e17-82b36d40c672\") " pod="hostpath-provisioner/csi-hostpathplugin-crrxv" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.017919 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9857afe-8e2a-4ddd-8fc2-2059bff84d86-config-volume\") pod \"dns-default-bdn5h\" (UID: \"f9857afe-8e2a-4ddd-8fc2-2059bff84d86\") " pod="openshift-dns/dns-default-bdn5h" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.018704 4565 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c78a2319-04ae-46f7-8b1c-7baec60f1dc6-config\") pod \"kube-controller-manager-operator-78b949d7b-v76jt\" (UID: \"c78a2319-04ae-46f7-8b1c-7baec60f1dc6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v76jt" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.018780 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c-audit-policies\") pod \"apiserver-7bbb656c7d-vj2db\" (UID: \"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.019330 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a888c987-e97d-4f33-9932-158161870fe6-secret-volume\") pod \"collect-profiles-29401020-4vwj8\" (UID: \"a888c987-e97d-4f33-9932-158161870fe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401020-4vwj8" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.019373 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/66cd3426-ad3a-499d-8b09-056348e1413a-stats-auth\") pod \"router-default-5444994796-h5ktx\" (UID: \"66cd3426-ad3a-499d-8b09-056348e1413a\") " pod="openshift-ingress/router-default-5444994796-h5ktx" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.019662 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/91f8c44a-7b95-4212-8976-753251e9959b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-f7mzc\" (UID: \"91f8c44a-7b95-4212-8976-753251e9959b\") " pod="openshift-marketplace/marketplace-operator-79b997595-f7mzc" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 
09:06:43.019888 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f49defba-4d11-4d11-8a4f-d5fcbe187c73-profile-collector-cert\") pod \"catalog-operator-68c6474976-c9xwj\" (UID: \"f49defba-4d11-4d11-8a4f-d5fcbe187c73\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9xwj" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.020403 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/66eb132d-fe22-490c-bd19-3c52de3b56ee-apiservice-cert\") pod \"packageserver-d55dfcdfc-f8pjc\" (UID: \"66eb132d-fe22-490c-bd19-3c52de3b56ee\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f8pjc" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.020404 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4fe46bc1-6ac5-4578-a9d1-7ce7345b8ff0-srv-cert\") pod \"olm-operator-6b444d44fb-w788b\" (UID: \"4fe46bc1-6ac5-4578-a9d1-7ce7345b8ff0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w788b" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.020779 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6148f932-f59f-4a65-8094-84fa424d40bb-cert\") pod \"ingress-canary-wmz6h\" (UID: \"6148f932-f59f-4a65-8094-84fa424d40bb\") " pod="openshift-ingress-canary/ingress-canary-wmz6h" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.020979 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d14ea7a8-bb04-40c4-a285-a69a80f1bc5a-proxy-tls\") pod \"machine-config-operator-74547568cd-c567c\" (UID: \"d14ea7a8-bb04-40c4-a285-a69a80f1bc5a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c567c" Nov 25 09:06:43 crc 
kubenswrapper[4565]: I1125 09:06:43.021364 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/30d6cc0f-1d5e-4675-9e42-69c79794ff83-node-bootstrap-token\") pod \"machine-config-server-d6pbv\" (UID: \"30d6cc0f-1d5e-4675-9e42-69c79794ff83\") " pod="openshift-machine-config-operator/machine-config-server-d6pbv" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.021376 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/91f8c44a-7b95-4212-8976-753251e9959b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-f7mzc\" (UID: \"91f8c44a-7b95-4212-8976-753251e9959b\") " pod="openshift-marketplace/marketplace-operator-79b997595-f7mzc" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.021365 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/00c23670-fc21-4730-a27e-ac490261f994-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-smktm\" (UID: \"00c23670-fc21-4730-a27e-ac490261f994\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-smktm" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.021730 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/30d6cc0f-1d5e-4675-9e42-69c79794ff83-certs\") pod \"machine-config-server-d6pbv\" (UID: \"30d6cc0f-1d5e-4675-9e42-69c79794ff83\") " pod="openshift-machine-config-operator/machine-config-server-d6pbv" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.021734 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/befd0413-a4be-4b8f-9070-6a27ca4c2ca8-serving-cert\") pod \"service-ca-operator-777779d784-8x25t\" (UID: 
\"befd0413-a4be-4b8f-9070-6a27ca4c2ca8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8x25t" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.022195 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1364197-953a-4e1b-ad5d-aff1a5f0f5ba-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5c4hq\" (UID: \"f1364197-953a-4e1b-ad5d-aff1a5f0f5ba\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5c4hq" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.022201 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dc4fc3b-ec76-4231-a2ed-d623be3502c0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-gm2dl\" (UID: \"6dc4fc3b-ec76-4231-a2ed-d623be3502c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gm2dl" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.022240 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c78a2319-04ae-46f7-8b1c-7baec60f1dc6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-v76jt\" (UID: \"c78a2319-04ae-46f7-8b1c-7baec60f1dc6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v76jt" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.022517 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/66eb132d-fe22-490c-bd19-3c52de3b56ee-webhook-cert\") pod \"packageserver-d55dfcdfc-f8pjc\" (UID: \"66eb132d-fe22-490c-bd19-3c52de3b56ee\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f8pjc" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.022582 4565 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/96805dde-4050-4b05-a10b-e7a1f1ed37c7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-57ghm\" (UID: \"96805dde-4050-4b05-a10b-e7a1f1ed37c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-57ghm" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.024192 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4fe46bc1-6ac5-4578-a9d1-7ce7345b8ff0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-w788b\" (UID: \"4fe46bc1-6ac5-4578-a9d1-7ce7345b8ff0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w788b" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.024831 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86e733b7-6fa7-4086-bb82-95cc1b9dfa51-proxy-tls\") pod \"machine-config-controller-84d6567774-cgzl7\" (UID: \"86e733b7-6fa7-4086-bb82-95cc1b9dfa51\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cgzl7" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.032712 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp8mn\" (UniqueName: \"kubernetes.io/projected/684d871e-b542-4963-be08-7dba0c7b6d6a-kube-api-access-lp8mn\") pod \"controller-manager-879f6c89f-7lhlj\" (UID: \"684d871e-b542-4963-be08-7dba0c7b6d6a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.034959 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-72h5b" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.051981 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfsm2\" (UniqueName: \"kubernetes.io/projected/cd73c816-12d9-42bc-bc10-2180ed13b36a-kube-api-access-lfsm2\") pod \"downloads-7954f5f757-wbp5x\" (UID: \"cd73c816-12d9-42bc-bc10-2180ed13b36a\") " pod="openshift-console/downloads-7954f5f757-wbp5x" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.071517 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nnsx\" (UniqueName: \"kubernetes.io/projected/755dab00-cc07-483e-82b6-8a3e54e6dee3-kube-api-access-8nnsx\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.097786 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl7km\" (UniqueName: \"kubernetes.io/projected/cf63e97f-f001-452c-8d17-1b8a4e40c3ae-kube-api-access-hl7km\") pod \"machine-approver-56656f9798-926vb\" (UID: \"cf63e97f-f001-452c-8d17-1b8a4e40c3ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-926vb" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.103198 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-kjbbz" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.110249 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:43 crc kubenswrapper[4565]: E1125 09:06:43.110342 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:43.6103225 +0000 UTC m=+136.812817638 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.110566 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:43 crc kubenswrapper[4565]: E1125 09:06:43.110780 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:43.610772434 +0000 UTC m=+136.813267572 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.114569 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nsft\" (UniqueName: \"kubernetes.io/projected/6becb00b-a391-46f2-b821-c626b7903924-kube-api-access-5nsft\") pod \"apiserver-76f77b778f-vvqqw\" (UID: \"6becb00b-a391-46f2-b821-c626b7903924\") " pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.124792 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.125538 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.132016 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tzsh\" (UniqueName: \"kubernetes.io/projected/25752784-0e60-4517-ba99-754385fa0ecb-kube-api-access-4tzsh\") pod \"openshift-controller-manager-operator-756b6f6bc6-fr2kb\" (UID: \"25752784-0e60-4517-ba99-754385fa0ecb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fr2kb" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.152381 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7294c\" (UniqueName: \"kubernetes.io/projected/bebd549b-749f-4e30-ab21-99c32a85c0ca-kube-api-access-7294c\") pod \"openshift-config-operator-7777fb866f-s7fbs\" (UID: \"bebd549b-749f-4e30-ab21-99c32a85c0ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s7fbs" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.161087 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-72h5b"] Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.169146 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-8xkx9" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.170864 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-4bqts" Nov 25 09:06:43 crc kubenswrapper[4565]: W1125 09:06:43.176807 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebcf1227_407e_41f6_bf75_d755ecfd724f.slice/crio-deac3d4befcf6cb201ccbbb2da09119b0df4d2eb875489757b992e51ecb03159 WatchSource:0}: Error finding container deac3d4befcf6cb201ccbbb2da09119b0df4d2eb875489757b992e51ecb03159: Status 404 returned error can't find the container with id deac3d4befcf6cb201ccbbb2da09119b0df4d2eb875489757b992e51ecb03159 Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.192765 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5nfc\" (UniqueName: \"kubernetes.io/projected/d2b8d9fc-1eab-46f7-a214-59255f20c0db-kube-api-access-l5nfc\") pod \"openshift-apiserver-operator-796bbdcf4f-n4r7s\" (UID: \"d2b8d9fc-1eab-46f7-a214-59255f20c0db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n4r7s" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.211134 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:43 crc kubenswrapper[4565]: E1125 09:06:43.211571 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:43.711558128 +0000 UTC m=+136.914053267 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.215165 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4841ec62-bf33-407a-b240-1409e85aa22e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-pzm4n\" (UID: \"4841ec62-bf33-407a-b240-1409e85aa22e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pzm4n" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.235857 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cc1943a-ad05-4d90-89c1-2e88e30d2c56-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-r5s6w\" (UID: \"7cc1943a-ad05-4d90-89c1-2e88e30d2c56\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r5s6w" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.257980 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/755dab00-cc07-483e-82b6-8a3e54e6dee3-bound-sa-token\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.275712 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gv5p\" (UniqueName: \"kubernetes.io/projected/4841ec62-bf33-407a-b240-1409e85aa22e-kube-api-access-5gv5p\") pod \"ingress-operator-5b745b69d9-pzm4n\" 
(UID: \"4841ec62-bf33-407a-b240-1409e85aa22e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pzm4n" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.278162 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-pgjmj" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.291530 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5gvw\" (UniqueName: \"kubernetes.io/projected/29e91702-afc2-470f-a3b9-9be851b01f9c-kube-api-access-w5gvw\") pod \"oauth-openshift-558db77b4-nzk24\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.292696 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-wbp5x" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.312215 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:43 crc kubenswrapper[4565]: E1125 09:06:43.312450 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:43.812436858 +0000 UTC m=+137.014931996 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.314457 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkpkl\" (UniqueName: \"kubernetes.io/projected/66cd3426-ad3a-499d-8b09-056348e1413a-kube-api-access-mkpkl\") pod \"router-default-5444994796-h5ktx\" (UID: \"66cd3426-ad3a-499d-8b09-056348e1413a\") " pod="openshift-ingress/router-default-5444994796-h5ktx" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.334036 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-926vb" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.337212 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvlt8\" (UniqueName: \"kubernetes.io/projected/4b598734-79f8-4a64-9f30-e08bd6c56bc7-kube-api-access-hvlt8\") pod \"migrator-59844c95c7-tj8jh\" (UID: \"4b598734-79f8-4a64-9f30-e08bd6c56bc7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tj8jh" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.357394 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmh4m\" (UniqueName: \"kubernetes.io/projected/6148f932-f59f-4a65-8094-84fa424d40bb-kube-api-access-mmh4m\") pod \"ingress-canary-wmz6h\" (UID: \"6148f932-f59f-4a65-8094-84fa424d40bb\") " pod="openshift-ingress-canary/ingress-canary-wmz6h" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.367761 4565 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vvqqw"] Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.376564 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28gtf\" (UniqueName: \"kubernetes.io/projected/96805dde-4050-4b05-a10b-e7a1f1ed37c7-kube-api-access-28gtf\") pod \"cluster-samples-operator-665b6dd947-57ghm\" (UID: \"96805dde-4050-4b05-a10b-e7a1f1ed37c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-57ghm" Nov 25 09:06:43 crc kubenswrapper[4565]: W1125 09:06:43.378747 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6becb00b_a391_46f2_b821_c626b7903924.slice/crio-3910e6cf13f0693ae08ed44f27dbc49862667d3025a9e564fbd129dc6659758b WatchSource:0}: Error finding container 3910e6cf13f0693ae08ed44f27dbc49862667d3025a9e564fbd129dc6659758b: Status 404 returned error can't find the container with id 3910e6cf13f0693ae08ed44f27dbc49862667d3025a9e564fbd129dc6659758b Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.387377 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s7fbs" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.391149 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtdf9\" (UniqueName: \"kubernetes.io/projected/86e733b7-6fa7-4086-bb82-95cc1b9dfa51-kube-api-access-wtdf9\") pod \"machine-config-controller-84d6567774-cgzl7\" (UID: \"86e733b7-6fa7-4086-bb82-95cc1b9dfa51\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cgzl7" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.394698 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8xkx9"] Nov 25 09:06:43 crc kubenswrapper[4565]: W1125 09:06:43.407612 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod262e9d06_4e95_4017_85d9_5657f520eb49.slice/crio-455edfaa548c4d82ec29d331aa2ae7d14a6e3769691c2def943b2abf7b648d10 WatchSource:0}: Error finding container 455edfaa548c4d82ec29d331aa2ae7d14a6e3769691c2def943b2abf7b648d10: Status 404 returned error can't find the container with id 455edfaa548c4d82ec29d331aa2ae7d14a6e3769691c2def943b2abf7b648d10 Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.411075 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n4r7s" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.413199 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:43 crc kubenswrapper[4565]: E1125 09:06:43.413643 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:43.913628796 +0000 UTC m=+137.116123934 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.414585 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wszf2\" (UniqueName: \"kubernetes.io/projected/befd0413-a4be-4b8f-9070-6a27ca4c2ca8-kube-api-access-wszf2\") pod \"service-ca-operator-777779d784-8x25t\" (UID: \"befd0413-a4be-4b8f-9070-6a27ca4c2ca8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8x25t" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.420432 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fr2kb" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.440458 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-4bqts"] Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.442512 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.446228 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z6x5\" (UniqueName: \"kubernetes.io/projected/f1364197-953a-4e1b-ad5d-aff1a5f0f5ba-kube-api-access-4z6x5\") pod \"package-server-manager-789f6589d5-5c4hq\" (UID: \"f1364197-953a-4e1b-ad5d-aff1a5f0f5ba\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5c4hq" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.456221 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pzm4n" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.460582 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r5s6w" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.462506 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xk9h\" (UniqueName: \"kubernetes.io/projected/a888c987-e97d-4f33-9932-158161870fe6-kube-api-access-5xk9h\") pod \"collect-profiles-29401020-4vwj8\" (UID: \"a888c987-e97d-4f33-9932-158161870fe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401020-4vwj8" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.480472 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb6dm\" (UniqueName: \"kubernetes.io/projected/6dc4fc3b-ec76-4231-a2ed-d623be3502c0-kube-api-access-zb6dm\") pod \"kube-storage-version-migrator-operator-b67b599dd-gm2dl\" (UID: \"6dc4fc3b-ec76-4231-a2ed-d623be3502c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gm2dl" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.501537 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgp2s\" (UniqueName: \"kubernetes.io/projected/f9857afe-8e2a-4ddd-8fc2-2059bff84d86-kube-api-access-mgp2s\") pod \"dns-default-bdn5h\" (UID: \"f9857afe-8e2a-4ddd-8fc2-2059bff84d86\") " pod="openshift-dns/dns-default-bdn5h" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.501740 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-h5ktx" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.517748 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c78a2319-04ae-46f7-8b1c-7baec60f1dc6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-v76jt\" (UID: \"c78a2319-04ae-46f7-8b1c-7baec60f1dc6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v76jt" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.518120 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:43 crc kubenswrapper[4565]: E1125 09:06:43.518401 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:44.018387607 +0000 UTC m=+137.220882745 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.521791 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7lhlj"] Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.525370 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-57ghm" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.528953 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gm2dl" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.545471 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cgzl7" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.549657 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwgs7\" (UniqueName: \"kubernetes.io/projected/12c5c094-a435-4607-9e17-82b36d40c672-kube-api-access-qwgs7\") pod \"csi-hostpathplugin-crrxv\" (UID: \"12c5c094-a435-4607-9e17-82b36d40c672\") " pod="hostpath-provisioner/csi-hostpathplugin-crrxv" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.568507 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-kjbbz"] Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.568623 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tj8jh" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.569972 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29p88\" (UniqueName: \"kubernetes.io/projected/30d6cc0f-1d5e-4675-9e42-69c79794ff83-kube-api-access-29p88\") pod \"machine-config-server-d6pbv\" (UID: \"30d6cc0f-1d5e-4675-9e42-69c79794ff83\") " pod="openshift-machine-config-operator/machine-config-server-d6pbv" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.574206 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-926vb" event={"ID":"cf63e97f-f001-452c-8d17-1b8a4e40c3ae","Type":"ContainerStarted","Data":"530edc09990a78f85e12bfbd43ccd928040dad4f6a26b84b00fc6141b1f5d132"} Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.589651 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5c4hq" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.597349 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8x25t" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.611262 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wmz6h" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.605380 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-wbp5x"] Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.609410 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf5dc\" (UniqueName: \"kubernetes.io/projected/00c23670-fc21-4730-a27e-ac490261f994-kube-api-access-sf5dc\") pod \"control-plane-machine-set-operator-78cbb6b69f-smktm\" (UID: \"00c23670-fc21-4730-a27e-ac490261f994\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-smktm" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.600943 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfmvd\" (UniqueName: \"kubernetes.io/projected/d14ea7a8-bb04-40c4-a285-a69a80f1bc5a-kube-api-access-mfmvd\") pod \"machine-config-operator-74547568cd-c567c\" (UID: \"d14ea7a8-bb04-40c4-a285-a69a80f1bc5a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c567c" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.615853 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wvc8w" event={"ID":"418a0125-b167-49b8-b6bd-0c97a587107c","Type":"ContainerStarted","Data":"db3f1e587c02b47ee982c0f70557746891ba360c0616efe70ac62c187b3e7dd1"} Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.616109 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wvc8w" event={"ID":"418a0125-b167-49b8-b6bd-0c97a587107c","Type":"ContainerStarted","Data":"e7e84d1d403aace90911e374e0f88713e50ce0bb106a4a61448f5b0a249d1d11"} Nov 25 
09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.621019 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401020-4vwj8" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.621363 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:43 crc kubenswrapper[4565]: E1125 09:06:43.621478 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:44.121426811 +0000 UTC m=+137.323921949 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.621891 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:43 crc kubenswrapper[4565]: E1125 09:06:43.622547 4565 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:44.122533708 +0000 UTC m=+137.325028845 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.625494 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8kzl\" (UniqueName: \"kubernetes.io/projected/91f8c44a-7b95-4212-8976-753251e9959b-kube-api-access-l8kzl\") pod \"marketplace-operator-79b997595-f7mzc\" (UID: \"91f8c44a-7b95-4212-8976-753251e9959b\") " pod="openshift-marketplace/marketplace-operator-79b997595-f7mzc" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.642700 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" event={"ID":"6becb00b-a391-46f2-b821-c626b7903924","Type":"ContainerStarted","Data":"3910e6cf13f0693ae08ed44f27dbc49862667d3025a9e564fbd129dc6659758b"} Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.649104 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-crrxv" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.656104 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bdn5h" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.661041 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-d6pbv" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.675357 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc24q\" (UniqueName: \"kubernetes.io/projected/3193422a-c236-406b-a51a-e2865b720ff4-kube-api-access-qc24q\") pod \"multus-admission-controller-857f4d67dd-2fq46\" (UID: \"3193422a-c236-406b-a51a-e2865b720ff4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2fq46" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.677140 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d46d9\" (UniqueName: \"kubernetes.io/projected/66eb132d-fe22-490c-bd19-3c52de3b56ee-kube-api-access-d46d9\") pod \"packageserver-d55dfcdfc-f8pjc\" (UID: \"66eb132d-fe22-490c-bd19-3c52de3b56ee\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f8pjc" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.685503 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtvbv\" (UniqueName: \"kubernetes.io/projected/cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c-kube-api-access-gtvbv\") pod \"apiserver-7bbb656c7d-vj2db\" (UID: \"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.689321 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-72h5b" event={"ID":"ebcf1227-407e-41f6-bf75-d755ecfd724f","Type":"ContainerStarted","Data":"08de76df26fab0615fb2bd5bf0e965d4c27925ac51ce7fdc5d18fff7349e85b4"} Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.689362 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-72h5b" 
event={"ID":"ebcf1227-407e-41f6-bf75-d755ecfd724f","Type":"ContainerStarted","Data":"deac3d4befcf6cb201ccbbb2da09119b0df4d2eb875489757b992e51ecb03159"} Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.695242 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ls4wr" event={"ID":"4792f05e-1c4a-4c87-991d-e26851501f52","Type":"ContainerStarted","Data":"6bc1c9f912d96da3c76c6be4ae2478c5379d15208b7b5a49dae27b9adef09021"} Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.695269 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ls4wr" event={"ID":"4792f05e-1c4a-4c87-991d-e26851501f52","Type":"ContainerStarted","Data":"abf3474173968dff3231789803664f976480bf7c34d7ec4351be24b22df64ac2"} Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.706511 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bdnw\" (UniqueName: \"kubernetes.io/projected/4fe46bc1-6ac5-4578-a9d1-7ce7345b8ff0-kube-api-access-9bdnw\") pod \"olm-operator-6b444d44fb-w788b\" (UID: \"4fe46bc1-6ac5-4578-a9d1-7ce7345b8ff0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w788b" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.709860 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8xkx9" event={"ID":"262e9d06-4e95-4017-85d9-5657f520eb49","Type":"ContainerStarted","Data":"455edfaa548c4d82ec29d331aa2ae7d14a6e3769691c2def943b2abf7b648d10"} Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.711111 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj" event={"ID":"684d871e-b542-4963-be08-7dba0c7b6d6a","Type":"ContainerStarted","Data":"80454e6aaec800da3bcfe866e71a6bee9fd2a9b6e097c928b6d4f9f5793d28ae"} Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 
09:06:43.723475 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:43 crc kubenswrapper[4565]: E1125 09:06:43.724725 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:44.224708049 +0000 UTC m=+137.427203186 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.729043 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm5j5\" (UniqueName: \"kubernetes.io/projected/f49defba-4d11-4d11-8a4f-d5fcbe187c73-kube-api-access-vm5j5\") pod \"catalog-operator-68c6474976-c9xwj\" (UID: \"f49defba-4d11-4d11-8a4f-d5fcbe187c73\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9xwj" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.730101 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6mnps" event={"ID":"9ba89570-d3db-4eb1-a768-a07858224030","Type":"ContainerStarted","Data":"0f13cd386a07f7a2736b28cf3e2c3ee1637a82e5610f71a1b46174a597052363"} Nov 25 09:06:43 crc 
kubenswrapper[4565]: I1125 09:06:43.730134 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6mnps" event={"ID":"9ba89570-d3db-4eb1-a768-a07858224030","Type":"ContainerStarted","Data":"6be8d54f6ae82039df7a780c42d299e6bb619f6833293f03c948f98463d6ad07"} Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.730553 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-6mnps" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.743423 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-4bqts" event={"ID":"0587232c-6c1f-44d8-b7d0-be44d147bd71","Type":"ContainerStarted","Data":"206901cadd6a22d6bb8cb64d1821d1f9fc73279692e45d17303f1f42fbb67aa3"} Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.746391 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pgjmj"] Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.757467 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrrwl\" (UniqueName: \"kubernetes.io/projected/74651f90-e9e9-4c93-a929-26765c304243-kube-api-access-qrrwl\") pod \"service-ca-9c57cc56f-q5drb\" (UID: \"74651f90-e9e9-4c93-a929-26765c304243\") " pod="openshift-service-ca/service-ca-9c57cc56f-q5drb" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.767077 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-s7fbs"] Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.785004 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr" event={"ID":"d9d37581-10f7-4e98-81e1-6a17ef4a527a","Type":"ContainerStarted","Data":"9dcf43ac0130e9930e6f54253538d094ae838abd781150edfe03d64d2df1550d"} Nov 
25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.785433 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr" event={"ID":"d9d37581-10f7-4e98-81e1-6a17ef4a527a","Type":"ContainerStarted","Data":"cbf6c16f72625e2153c1cd68de024d9dfa0125d12713d23e9dfb67e542bff93b"} Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.786306 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.799345 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-6mnps" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.808600 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.814512 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9xwj" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.819417 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v76jt" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.827480 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:43 crc kubenswrapper[4565]: E1125 09:06:43.830166 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:44.33015421 +0000 UTC m=+137.532649347 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.840120 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w788b" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.851487 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2fq46" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.859370 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f8pjc" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.873587 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-q5drb" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.902158 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-smktm" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.905341 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c567c" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.920132 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-f7mzc" Nov 25 09:06:43 crc kubenswrapper[4565]: I1125 09:06:43.930067 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:43 crc kubenswrapper[4565]: E1125 09:06:43.931081 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:44.431066622 +0000 UTC m=+137.633561760 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.034851 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:44 crc kubenswrapper[4565]: E1125 09:06:44.035163 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:44.535151719 +0000 UTC m=+137.737646857 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.077024 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr" Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.136366 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:44 crc kubenswrapper[4565]: E1125 09:06:44.136460 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:44.63644751 +0000 UTC m=+137.838942649 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.136856 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:44 crc kubenswrapper[4565]: E1125 09:06:44.137193 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:44.637178413 +0000 UTC m=+137.839673551 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.161696 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nzk24"] Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.237657 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:44 crc kubenswrapper[4565]: E1125 09:06:44.252617 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:44.752593591 +0000 UTC m=+137.955088729 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.282503 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:44 crc kubenswrapper[4565]: E1125 09:06:44.285645 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:44.785634386 +0000 UTC m=+137.988129524 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.327021 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-pzm4n"] Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.387238 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:44 crc kubenswrapper[4565]: E1125 09:06:44.387760 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:44.887744476 +0000 UTC m=+138.090239615 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.431984 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gm2dl"] Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.492565 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:44 crc kubenswrapper[4565]: E1125 09:06:44.492963 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:44.992951748 +0000 UTC m=+138.195446887 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.565520 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-tj8jh"] Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.575722 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-6mnps" podStartSLOduration=119.575702699 podStartE2EDuration="1m59.575702699s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:44.569631206 +0000 UTC m=+137.772126344" watchObservedRunningTime="2025-11-25 09:06:44.575702699 +0000 UTC m=+137.778197838" Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.594335 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:44 crc kubenswrapper[4565]: E1125 09:06:44.594896 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 09:06:45.094882343 +0000 UTC m=+138.297377480 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.665998 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fr2kb"] Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.680065 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-wvc8w" podStartSLOduration=119.68005127 podStartE2EDuration="1m59.68005127s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:44.664526197 +0000 UTC m=+137.867021334" watchObservedRunningTime="2025-11-25 09:06:44.68005127 +0000 UTC m=+137.882546409" Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.682019 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r5s6w"] Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.695660 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:44 crc kubenswrapper[4565]: E1125 09:06:44.695960 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:45.195949675 +0000 UTC m=+138.398444813 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:44 crc kubenswrapper[4565]: W1125 09:06:44.733524 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b598734_79f8_4a64_9f30_e08bd6c56bc7.slice/crio-cb52e65819da2e3fe4858f41ea9696f776181ec7edf52d6d798e71c8a56f43d0 WatchSource:0}: Error finding container cb52e65819da2e3fe4858f41ea9696f776181ec7edf52d6d798e71c8a56f43d0: Status 404 returned error can't find the container with id cb52e65819da2e3fe4858f41ea9696f776181ec7edf52d6d798e71c8a56f43d0 Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.757427 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n4r7s"] Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.790843 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wmz6h"] Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.798561 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:44 crc kubenswrapper[4565]: E1125 09:06:44.799025 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:45.299011762 +0000 UTC m=+138.501506890 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.803123 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-cgzl7"] Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.832481 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" event={"ID":"29e91702-afc2-470f-a3b9-9be851b01f9c","Type":"ContainerStarted","Data":"4cde4549eefc859b20747c28cdf16f857994d8c09c9c4defb0583c8061000777"} Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.836035 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj" event={"ID":"684d871e-b542-4963-be08-7dba0c7b6d6a","Type":"ContainerStarted","Data":"5f5b13af332582ca68525477848651f7ff8b47fe7930eba79537694c51ff1c4d"} Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.836166 4565 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj" Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.841669 4565 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7lhlj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.841700 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj" podUID="684d871e-b542-4963-be08-7dba0c7b6d6a" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.846982 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8xkx9" event={"ID":"262e9d06-4e95-4017-85d9-5657f520eb49","Type":"ContainerStarted","Data":"060008ea0c1a869086a3484d3b05fff881fc595ddf47a3754b0a425a01ff8665"} Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.848723 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-wbp5x" event={"ID":"cd73c816-12d9-42bc-bc10-2180ed13b36a","Type":"ContainerStarted","Data":"3f438cd5ce9d7e6b655f0d0cf6f414b8e34aa269973dc7a2376d7df197531176"} Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.848767 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-wbp5x" event={"ID":"cd73c816-12d9-42bc-bc10-2180ed13b36a","Type":"ContainerStarted","Data":"87fd07f9b6846d97eaebee65d1cb0afc837dc0a2015dcd6d487198da14b7a1be"} Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.850374 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/downloads-7954f5f757-wbp5x" Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.855805 4565 patch_prober.go:28] interesting pod/downloads-7954f5f757-wbp5x container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.855846 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wbp5x" podUID="cd73c816-12d9-42bc-bc10-2180ed13b36a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.856088 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gm2dl" event={"ID":"6dc4fc3b-ec76-4231-a2ed-d623be3502c0","Type":"ContainerStarted","Data":"1b02476425494ae903ce7e2f599161047121115d245bdb46fe002a93706c4904"} Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.864454 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pzm4n" event={"ID":"4841ec62-bf33-407a-b240-1409e85aa22e","Type":"ContainerStarted","Data":"80a17fc9f546997077dab57ac3bd4fee282254f298baaa94792bd0dff5bfdc95"} Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.868156 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-d6pbv" event={"ID":"30d6cc0f-1d5e-4675-9e42-69c79794ff83","Type":"ContainerStarted","Data":"c26cc26bb717764e17fe23847ca5bbb0df778c16fd13669f7d0960dc066293c1"} Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.886274 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-kjbbz" 
event={"ID":"394e1c70-96f6-4655-bb1a-50850d43136e","Type":"ContainerStarted","Data":"7b6ada9681148046e61890c5623fd6db52a4b94bd711cd2e0604ad9cc5f3ac26"} Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.891883 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr" podStartSLOduration=119.891871233 podStartE2EDuration="1m59.891871233s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:44.852981231 +0000 UTC m=+138.055476369" watchObservedRunningTime="2025-11-25 09:06:44.891871233 +0000 UTC m=+138.094366371" Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.892480 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ls4wr" podStartSLOduration=119.892473684 podStartE2EDuration="1m59.892473684s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:44.889543806 +0000 UTC m=+138.092038944" watchObservedRunningTime="2025-11-25 09:06:44.892473684 +0000 UTC m=+138.094968821" Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.896918 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fr2kb" event={"ID":"25752784-0e60-4517-ba99-754385fa0ecb","Type":"ContainerStarted","Data":"a93ef03e897365d26f556124cf783ab4c929829c11f07965ecdc4239557b0a9f"} Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.900013 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:44 crc kubenswrapper[4565]: E1125 09:06:44.900733 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:45.400723856 +0000 UTC m=+138.603218994 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.911645 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-pgjmj" event={"ID":"918a1a54-550d-4e79-b0c9-360d88fd8a6e","Type":"ContainerStarted","Data":"4853e6f4388f7e8de9eea9a797923f7e32b7c28959d7a6f7e26d88373f9c65d4"} Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.964143 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-926vb" event={"ID":"cf63e97f-f001-452c-8d17-1b8a4e40c3ae","Type":"ContainerStarted","Data":"104fd183c015dc37504756f2d82b42294c182f6759b6d76a15ea82d55bdcfbeb"} Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.974167 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-72h5b" 
podStartSLOduration=119.974152151 podStartE2EDuration="1m59.974152151s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:44.946577735 +0000 UTC m=+138.149072873" watchObservedRunningTime="2025-11-25 09:06:44.974152151 +0000 UTC m=+138.176647290" Nov 25 09:06:44 crc kubenswrapper[4565]: I1125 09:06:44.996558 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s7fbs" event={"ID":"bebd549b-749f-4e30-ab21-99c32a85c0ca","Type":"ContainerStarted","Data":"af95ff0a45bdfd55fb70203d7861b5f28b02637c955b4fc5ed5fcf1a83dcc638"} Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.002140 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:45 crc kubenswrapper[4565]: E1125 09:06:45.002556 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:45.502541868 +0000 UTC m=+138.705037006 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.017710 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5c4hq"] Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.025675 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-57ghm"] Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.028574 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tj8jh" event={"ID":"4b598734-79f8-4a64-9f30-e08bd6c56bc7","Type":"ContainerStarted","Data":"cb52e65819da2e3fe4858f41ea9696f776181ec7edf52d6d798e71c8a56f43d0"} Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.050227 4565 generic.go:334] "Generic (PLEG): container finished" podID="6becb00b-a391-46f2-b821-c626b7903924" containerID="05596658981f10648518dd555573a5927145a4dd0c9889838d98849fb551945e" exitCode=0 Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.050288 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" event={"ID":"6becb00b-a391-46f2-b821-c626b7903924","Type":"ContainerDied","Data":"05596658981f10648518dd555573a5927145a4dd0c9889838d98849fb551945e"} Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.059106 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bdn5h"] Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.077347 4565 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-h5ktx" event={"ID":"66cd3426-ad3a-499d-8b09-056348e1413a","Type":"ContainerStarted","Data":"b54ad64f809e7aeeaf596abbe21fc66280a1d3fe6dcd335f2bbc195f359df130"} Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.092666 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db"] Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.103752 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:45 crc kubenswrapper[4565]: E1125 09:06:45.104436 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:45.604420714 +0000 UTC m=+138.806915853 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.207595 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:45 crc kubenswrapper[4565]: E1125 09:06:45.207772 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:45.707750895 +0000 UTC m=+138.910246032 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.208236 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:45 crc kubenswrapper[4565]: E1125 09:06:45.208777 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:45.708769946 +0000 UTC m=+138.911265084 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.219328 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401020-4vwj8"] Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.272425 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2fq46"] Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.317641 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:45 crc kubenswrapper[4565]: E1125 09:06:45.317751 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:45.817704483 +0000 UTC m=+139.020199622 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.318188 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:45 crc kubenswrapper[4565]: E1125 09:06:45.318535 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:45.818523571 +0000 UTC m=+139.021018708 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.322778 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-f7mzc"] Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.324649 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-crrxv"] Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.334919 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v76jt"] Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.371530 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w788b"] Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.386737 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f8pjc"] Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.387100 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-smktm"] Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.387240 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-q5drb"] Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.391876 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9xwj"] Nov 25 09:06:45 crc 
kubenswrapper[4565]: I1125 09:06:45.413951 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj" podStartSLOduration=120.413918489 podStartE2EDuration="2m0.413918489s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:45.411923606 +0000 UTC m=+138.614437880" watchObservedRunningTime="2025-11-25 09:06:45.413918489 +0000 UTC m=+138.616413627" Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.419201 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:45 crc kubenswrapper[4565]: E1125 09:06:45.420284 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:45.91954583 +0000 UTC m=+139.122040968 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.445391 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-c567c"] Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.465652 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8x25t"] Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.503443 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-wbp5x" podStartSLOduration=120.503429676 podStartE2EDuration="2m0.503429676s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:45.502160925 +0000 UTC m=+138.704656064" watchObservedRunningTime="2025-11-25 09:06:45.503429676 +0000 UTC m=+138.705924814" Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.520361 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:45 crc kubenswrapper[4565]: E1125 09:06:45.520626 4565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:46.020615888 +0000 UTC m=+139.223111026 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.630831 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:45 crc kubenswrapper[4565]: E1125 09:06:45.631709 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:46.131690762 +0000 UTC m=+139.334185900 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.733476 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:45 crc kubenswrapper[4565]: E1125 09:06:45.734271 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:46.234260996 +0000 UTC m=+139.436756134 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.837550 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:45 crc kubenswrapper[4565]: E1125 09:06:45.838218 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:46.338202833 +0000 UTC m=+139.540697971 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:45 crc kubenswrapper[4565]: I1125 09:06:45.942309 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:45 crc kubenswrapper[4565]: E1125 09:06:45.942685 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:46.442674606 +0000 UTC m=+139.645169744 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.042916 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:46 crc kubenswrapper[4565]: E1125 09:06:46.043337 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:46.543269092 +0000 UTC m=+139.745764230 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.144145 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:46 crc kubenswrapper[4565]: E1125 09:06:46.144743 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:46.64473285 +0000 UTC m=+139.847227988 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.150805 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pzm4n" event={"ID":"4841ec62-bf33-407a-b240-1409e85aa22e","Type":"ContainerStarted","Data":"894dafec4ec84dedfba354dd57807716a87dfda9627e82c47a2292b5158bf250"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.150859 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pzm4n" event={"ID":"4841ec62-bf33-407a-b240-1409e85aa22e","Type":"ContainerStarted","Data":"23485916841df05df0630b06e97f813968cc0eb7d430f4c618bd8efc3ae4511a"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.153401 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8xkx9" event={"ID":"262e9d06-4e95-4017-85d9-5657f520eb49","Type":"ContainerStarted","Data":"15289812c099b810d6a1d3991264cb6858d3483618bd89808a3a84ac75327681"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.155070 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v76jt" event={"ID":"c78a2319-04ae-46f7-8b1c-7baec60f1dc6","Type":"ContainerStarted","Data":"76210d75e93d10ae0b6c28578abd29e8314caaedfb1d859dd914e4e1d99bf5e6"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.157478 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns-operator/dns-operator-744455d44c-4bqts" event={"ID":"0587232c-6c1f-44d8-b7d0-be44d147bd71","Type":"ContainerStarted","Data":"014e0229a3973c09504d88632fc6018e607301faa2ac3c6163007ad604584677"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.162071 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r5s6w" event={"ID":"7cc1943a-ad05-4d90-89c1-2e88e30d2c56","Type":"ContainerStarted","Data":"b9b83e9b61fafec785048b637add32a26ab212bc75164f67ee3075e0dbe64f8d"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.162110 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r5s6w" event={"ID":"7cc1943a-ad05-4d90-89c1-2e88e30d2c56","Type":"ContainerStarted","Data":"597d21dfe65c3a06b3c51f38eb90689e990b1bf3ec5bb179c4ad70d87666ae32"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.168431 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401020-4vwj8" event={"ID":"a888c987-e97d-4f33-9932-158161870fe6","Type":"ContainerStarted","Data":"a194d5f1f16bf081a4b2ca436ff71e4b5474979d0e082b6fc5639848ce9d1900"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.172706 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-h5ktx" event={"ID":"66cd3426-ad3a-499d-8b09-056348e1413a","Type":"ContainerStarted","Data":"cda6a327a0b8b4989218521fc42f9afef43a9f152f57d79a88105f001e505fee"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.218550 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pzm4n" podStartSLOduration=121.2185314 podStartE2EDuration="2m1.2185314s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-11-25 09:06:46.176906136 +0000 UTC m=+139.379401273" watchObservedRunningTime="2025-11-25 09:06:46.2185314 +0000 UTC m=+139.421026538" Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.235239 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tj8jh" event={"ID":"4b598734-79f8-4a64-9f30-e08bd6c56bc7","Type":"ContainerStarted","Data":"8c98b4cd32fb83f00ee030f434e189b3fbb80ae68298e1bc1398e7d524efa251"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.248497 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:46 crc kubenswrapper[4565]: E1125 09:06:46.248613 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:46.748592282 +0000 UTC m=+139.951087420 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.248879 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:46 crc kubenswrapper[4565]: E1125 09:06:46.250563 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:46.750550046 +0000 UTC m=+139.953045184 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.266938 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fr2kb" event={"ID":"25752784-0e60-4517-ba99-754385fa0ecb","Type":"ContainerStarted","Data":"4555fd5295410046af2c81ba8550775e5a376c50b6f56099dd2e9b5a2f786e44"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.269441 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-h5ktx" podStartSLOduration=121.269427242 podStartE2EDuration="2m1.269427242s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:46.219405831 +0000 UTC m=+139.421900969" watchObservedRunningTime="2025-11-25 09:06:46.269427242 +0000 UTC m=+139.471922380" Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.270893 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29401020-4vwj8" podStartSLOduration=122.270885338 podStartE2EDuration="2m2.270885338s" podCreationTimestamp="2025-11-25 09:04:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:46.251429437 +0000 UTC m=+139.453924575" watchObservedRunningTime="2025-11-25 09:06:46.270885338 +0000 UTC 
m=+139.473380476" Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.300471 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-8xkx9" podStartSLOduration=121.300458244 podStartE2EDuration="2m1.300458244s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:46.292157206 +0000 UTC m=+139.494652345" watchObservedRunningTime="2025-11-25 09:06:46.300458244 +0000 UTC m=+139.502953382" Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.309474 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-pgjmj" event={"ID":"918a1a54-550d-4e79-b0c9-360d88fd8a6e","Type":"ContainerStarted","Data":"d57af581900e0e3f7629a92a832dbad9462fbca9873f6e7c9a3aa3cceb301609"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.311681 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" event={"ID":"29e91702-afc2-470f-a3b9-9be851b01f9c","Type":"ContainerStarted","Data":"e849b5f659d32513a2059d4874784291a37829cc5d84c7ec44960fbdd2d2139a"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.312273 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.318912 4565 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-nzk24 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused" start-of-body= Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.318965 4565 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" podUID="29e91702-afc2-470f-a3b9-9be851b01f9c" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused" Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.342308 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gm2dl" event={"ID":"6dc4fc3b-ec76-4231-a2ed-d623be3502c0","Type":"ContainerStarted","Data":"0ab5b448a4bb8fddc98b0722c340dd63b46a286ecb78c3e4189bf8dfe06e21fe"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.352210 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:46 crc kubenswrapper[4565]: E1125 09:06:46.354029 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:46.854010501 +0000 UTC m=+140.056505639 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.397662 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9xwj" event={"ID":"f49defba-4d11-4d11-8a4f-d5fcbe187c73","Type":"ContainerStarted","Data":"074b236a6e1035a8b38e70dbaac8853b8b17a0bc07c047826eb18aaf149e0627"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.465211 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" podStartSLOduration=122.465185061 podStartE2EDuration="2m2.465185061s" podCreationTimestamp="2025-11-25 09:04:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:46.465107235 +0000 UTC m=+139.667602373" watchObservedRunningTime="2025-11-25 09:06:46.465185061 +0000 UTC m=+139.667680199" Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.466229 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.466304 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r5s6w" podStartSLOduration=121.466295175 podStartE2EDuration="2m1.466295175s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:46.377330032 +0000 UTC m=+139.579825170" watchObservedRunningTime="2025-11-25 09:06:46.466295175 +0000 UTC m=+139.668790313" Nov 25 09:06:46 crc kubenswrapper[4565]: E1125 09:06:46.466519 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:46.966508384 +0000 UTC m=+140.169003523 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.493087 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-d6pbv" event={"ID":"30d6cc0f-1d5e-4675-9e42-69c79794ff83","Type":"ContainerStarted","Data":"4a692eef973687aeed93d05b82fdc6017b637ec5e17c04441b1f6776415dd77e"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.506836 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-h5ktx" Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.514307 4565 patch_prober.go:28] interesting pod/router-default-5444994796-h5ktx container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 09:06:46 crc kubenswrapper[4565]: [-]has-synced failed: reason withheld Nov 25 09:06:46 crc kubenswrapper[4565]: [+]process-running ok Nov 25 09:06:46 crc kubenswrapper[4565]: healthz check failed Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.514371 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5ktx" podUID="66cd3426-ad3a-499d-8b09-056348e1413a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.542171 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wmz6h" event={"ID":"6148f932-f59f-4a65-8094-84fa424d40bb","Type":"ContainerStarted","Data":"010381ea422a520b8642a585ad56745615990d6e27b60dbe55dc6e843840c1c3"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.542218 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wmz6h" event={"ID":"6148f932-f59f-4a65-8094-84fa424d40bb","Type":"ContainerStarted","Data":"295764f8571f446523c01c1860f0bc0b45af80dcc90fa3cd880bac02f8ed44fe"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.558854 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gm2dl" podStartSLOduration=121.558829595 podStartE2EDuration="2m1.558829595s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:46.558611215 +0000 UTC m=+139.761106354" watchObservedRunningTime="2025-11-25 09:06:46.558829595 +0000 UTC m=+139.761324733" Nov 25 09:06:46 crc kubenswrapper[4565]: 
I1125 09:06:46.559604 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f8pjc" event={"ID":"66eb132d-fe22-490c-bd19-3c52de3b56ee","Type":"ContainerStarted","Data":"695e25a59d529c19f82db986d2b87452e679c7023cc6f4cab1ae39ec4875c228"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.560405 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f8pjc" Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.566326 4565 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-f8pjc container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" start-of-body= Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.566358 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f8pjc" podUID="66eb132d-fe22-490c-bd19-3c52de3b56ee" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.573102 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:46 crc kubenswrapper[4565]: E1125 09:06:46.574396 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 09:06:47.074380759 +0000 UTC m=+140.276875897 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.592141 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c567c" event={"ID":"d14ea7a8-bb04-40c4-a285-a69a80f1bc5a","Type":"ContainerStarted","Data":"6428feaa93c3d91e802634a4f6905a0f481c5947d38d3991b05cc30a3fb25cd9"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.603075 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n4r7s" event={"ID":"d2b8d9fc-1eab-46f7-a214-59255f20c0db","Type":"ContainerStarted","Data":"3380440fd3d6c29cd551323d50e81236752fe1993a1636a767dd511183edefa1"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.603104 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n4r7s" event={"ID":"d2b8d9fc-1eab-46f7-a214-59255f20c0db","Type":"ContainerStarted","Data":"d0a0e9c8ff88a5db0b703bfb9d8d52bf9018db95c0e2f40ee94832514dd74271"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.645986 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-pgjmj" podStartSLOduration=122.645966538 podStartE2EDuration="2m2.645966538s" podCreationTimestamp="2025-11-25 09:04:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:46.644975998 +0000 UTC m=+139.847471136" watchObservedRunningTime="2025-11-25 09:06:46.645966538 +0000 UTC m=+139.848461675" Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.651757 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-q5drb" event={"ID":"74651f90-e9e9-4c93-a929-26765c304243","Type":"ContainerStarted","Data":"4e10414337f75916664a3026569675b9d729489f73d2d614545bed5582cf1fd7"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.667244 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5c4hq" event={"ID":"f1364197-953a-4e1b-ad5d-aff1a5f0f5ba","Type":"ContainerStarted","Data":"c5e4239a12a9117e2e58b4e851472b4441bcef2bd06bfc3b9bf1160d32835661"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.667286 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5c4hq" event={"ID":"f1364197-953a-4e1b-ad5d-aff1a5f0f5ba","Type":"ContainerStarted","Data":"249c8d603942a7ba144d51e60e76aa98df3e8b4be91649053f09cc4e79a35c9d"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.675591 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:46 crc kubenswrapper[4565]: E1125 09:06:46.676708 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-25 09:06:47.176696915 +0000 UTC m=+140.379192054 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.679104 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-crrxv" event={"ID":"12c5c094-a435-4607-9e17-82b36d40c672","Type":"ContainerStarted","Data":"7184a7be2633af275d0cffe0168e03e8900b5c10ff3bfa33b11324e4cda83076"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.691193 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fr2kb" podStartSLOduration=121.691181207 podStartE2EDuration="2m1.691181207s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:46.688970207 +0000 UTC m=+139.891465346" watchObservedRunningTime="2025-11-25 09:06:46.691181207 +0000 UTC m=+139.893676336" Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.691578 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w788b" event={"ID":"4fe46bc1-6ac5-4578-a9d1-7ce7345b8ff0","Type":"ContainerStarted","Data":"33fa85745f9514b568c20f04a57938a1842f3f44c51bf02c48cf82cfe32b86ab"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.716320 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" event={"ID":"6becb00b-a391-46f2-b821-c626b7903924","Type":"ContainerStarted","Data":"e485ca4af49073f03739efe941b506ca09aca0bb35da4827f4dd9bedb4ee1261"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.730073 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-f7mzc" event={"ID":"91f8c44a-7b95-4212-8976-753251e9959b","Type":"ContainerStarted","Data":"5e6c3fb2fa614ef47a443e9b9b79e62027e238b7f9774356c60c8993b07b4982"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.776339 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:46 crc kubenswrapper[4565]: E1125 09:06:46.776673 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:47.276658493 +0000 UTC m=+140.479153632 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.808017 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-926vb" event={"ID":"cf63e97f-f001-452c-8d17-1b8a4e40c3ae","Type":"ContainerStarted","Data":"bd4f04bb0221b44631e88a6b99f80030f1ec3c6c327be0163f117b54031d95e2"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.813543 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f8pjc" podStartSLOduration=121.81353147 podStartE2EDuration="2m1.81353147s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:46.762242972 +0000 UTC m=+139.964738110" watchObservedRunningTime="2025-11-25 09:06:46.81353147 +0000 UTC m=+140.016026608" Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.814123 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wmz6h" podStartSLOduration=6.814118953 podStartE2EDuration="6.814118953s" podCreationTimestamp="2025-11-25 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:46.812196304 +0000 UTC m=+140.014691443" watchObservedRunningTime="2025-11-25 09:06:46.814118953 +0000 UTC m=+140.016614091" Nov 25 09:06:46 crc 
kubenswrapper[4565]: I1125 09:06:46.857688 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2fq46" event={"ID":"3193422a-c236-406b-a51a-e2865b720ff4","Type":"ContainerStarted","Data":"3bb039cbf4e7e99f3c0d9f8c3b07342eff5f9e976b6316a6c367b3544dae2b01"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.858346 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-d6pbv" podStartSLOduration=6.858331521 podStartE2EDuration="6.858331521s" podCreationTimestamp="2025-11-25 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:46.858115266 +0000 UTC m=+140.060610404" watchObservedRunningTime="2025-11-25 09:06:46.858331521 +0000 UTC m=+140.060826660" Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.884000 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:46 crc kubenswrapper[4565]: E1125 09:06:46.884258 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:47.384248749 +0000 UTC m=+140.586743887 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.903793 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-kjbbz" event={"ID":"394e1c70-96f6-4655-bb1a-50850d43136e","Type":"ContainerStarted","Data":"79899ca1a3d7669abb80883b500b8d5ee818bfba9473899d6c01f7ebe19c9390"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.920506 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" event={"ID":"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c","Type":"ContainerStarted","Data":"2b69268c0442c8bfe7c2979ec7818175bb1360df1c6c38b6298e6255062004e9"} Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.979443 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n4r7s" podStartSLOduration=122.979413956 podStartE2EDuration="2m2.979413956s" podCreationTimestamp="2025-11-25 09:04:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:46.974528227 +0000 UTC m=+140.177023365" watchObservedRunningTime="2025-11-25 09:06:46.979413956 +0000 UTC m=+140.181909094" Nov 25 09:06:46 crc kubenswrapper[4565]: I1125 09:06:46.985126 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:46 crc kubenswrapper[4565]: E1125 09:06:46.986281 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:47.486268369 +0000 UTC m=+140.688763507 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.015554 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-smktm" event={"ID":"00c23670-fc21-4730-a27e-ac490261f994","Type":"ContainerStarted","Data":"15a9aaf06f0f7d4b96311f214ab766b727f4649583c2d0865fd4d2c410b3213e"} Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.059387 4565 generic.go:334] "Generic (PLEG): container finished" podID="bebd549b-749f-4e30-ab21-99c32a85c0ca" containerID="bc14f1149fddf1ad6673914efda161afccd5a1ec71aae2ae6436985fbc52a716" exitCode=0 Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.059672 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s7fbs" event={"ID":"bebd549b-749f-4e30-ab21-99c32a85c0ca","Type":"ContainerDied","Data":"bc14f1149fddf1ad6673914efda161afccd5a1ec71aae2ae6436985fbc52a716"} Nov 25 
09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.087568 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8x25t" event={"ID":"befd0413-a4be-4b8f-9070-6a27ca4c2ca8","Type":"ContainerStarted","Data":"b19ca706e8ae5910aa4a7a75277f9243b3549fb2ffd3a9ec69754bf0a365f245"} Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.088701 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.090062 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-kjbbz" podStartSLOduration=122.090044475 podStartE2EDuration="2m2.090044475s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:47.088111839 +0000 UTC m=+140.290606977" watchObservedRunningTime="2025-11-25 09:06:47.090044475 +0000 UTC m=+140.292539613" Nov 25 09:06:47 crc kubenswrapper[4565]: E1125 09:06:47.090098 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:47.590087115 +0000 UTC m=+140.792582253 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.114532 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-926vb" podStartSLOduration=123.114517792 podStartE2EDuration="2m3.114517792s" podCreationTimestamp="2025-11-25 09:04:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:47.114209524 +0000 UTC m=+140.316704662" watchObservedRunningTime="2025-11-25 09:06:47.114517792 +0000 UTC m=+140.317012931" Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.139302 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cgzl7" event={"ID":"86e733b7-6fa7-4086-bb82-95cc1b9dfa51","Type":"ContainerStarted","Data":"bf9d78e7e723fea70f97f784fee852e10eb86dc4cb0fcb1bd83d3b09667c1a0b"} Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.139329 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cgzl7" event={"ID":"86e733b7-6fa7-4086-bb82-95cc1b9dfa51","Type":"ContainerStarted","Data":"f24e1682c8bb038c0f6561eb6b674dc3caa94e54693f35d357d273687fcb1ee4"} Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.139340 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-57ghm" 
event={"ID":"96805dde-4050-4b05-a10b-e7a1f1ed37c7","Type":"ContainerStarted","Data":"b9def1ca778519507d884c5bd22b81402df133b314096a0274fb19d6471c57eb"} Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.141546 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bdn5h" event={"ID":"f9857afe-8e2a-4ddd-8fc2-2059bff84d86","Type":"ContainerStarted","Data":"3c94dcbc293ea5d8fb0ed3365b1211ff1b74c42c50dd8fa730453d54a42e9945"} Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.143100 4565 patch_prober.go:28] interesting pod/downloads-7954f5f757-wbp5x container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.143128 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wbp5x" podUID="cd73c816-12d9-42bc-bc10-2180ed13b36a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.166409 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj" Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.189406 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:47 crc kubenswrapper[4565]: E1125 09:06:47.193654 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:47.693639492 +0000 UTC m=+140.896134630 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.232815 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-smktm" podStartSLOduration=122.232783991 podStartE2EDuration="2m2.232783991s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:47.230122326 +0000 UTC m=+140.432617464" watchObservedRunningTime="2025-11-25 09:06:47.232783991 +0000 UTC m=+140.435279129" Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.288377 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8x25t" podStartSLOduration=122.288357772 podStartE2EDuration="2m2.288357772s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:47.285133561 +0000 UTC m=+140.487628698" watchObservedRunningTime="2025-11-25 09:06:47.288357772 +0000 UTC m=+140.490852909" Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.290768 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:47 crc kubenswrapper[4565]: E1125 09:06:47.291228 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:47.791215714 +0000 UTC m=+140.993710852 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.392630 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:47 crc kubenswrapper[4565]: E1125 09:06:47.393639 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:47.893625558 +0000 UTC m=+141.096120696 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.494767 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:47 crc kubenswrapper[4565]: E1125 09:06:47.495067 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:47.995054429 +0000 UTC m=+141.197549567 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.519983 4565 patch_prober.go:28] interesting pod/router-default-5444994796-h5ktx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 09:06:47 crc kubenswrapper[4565]: [-]has-synced failed: reason withheld Nov 25 09:06:47 crc kubenswrapper[4565]: [+]process-running ok Nov 25 09:06:47 crc kubenswrapper[4565]: healthz check failed Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.520031 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5ktx" podUID="66cd3426-ad3a-499d-8b09-056348e1413a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.605166 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:47 crc kubenswrapper[4565]: E1125 09:06:47.605912 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 09:06:48.105895163 +0000 UTC m=+141.308390302 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.710966 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:47 crc kubenswrapper[4565]: E1125 09:06:47.711287 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:48.211274559 +0000 UTC m=+141.413769697 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.811468 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:47 crc kubenswrapper[4565]: E1125 09:06:47.811913 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:48.311899693 +0000 UTC m=+141.514394830 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:47 crc kubenswrapper[4565]: I1125 09:06:47.912639 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:47 crc kubenswrapper[4565]: E1125 09:06:47.912994 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:48.412981694 +0000 UTC m=+141.615476832 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.013819 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:48 crc kubenswrapper[4565]: E1125 09:06:48.014140 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:48.514091637 +0000 UTC m=+141.716586775 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.014351 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:48 crc kubenswrapper[4565]: E1125 09:06:48.014751 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:48.514738361 +0000 UTC m=+141.717233499 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.115253 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:48 crc kubenswrapper[4565]: E1125 09:06:48.115598 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:48.615582626 +0000 UTC m=+141.818077764 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.157075 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tj8jh" event={"ID":"4b598734-79f8-4a64-9f30-e08bd6c56bc7","Type":"ContainerStarted","Data":"7277e2cb42530cf14a7f701b8fdad8e64d471398ae26bcf49be498816a47583a"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.159630 4565 generic.go:334] "Generic (PLEG): container finished" podID="cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c" containerID="bb49ce4b65c23b812e99b0e098f996415a4000cdc4493e6c3966aa15936c6e5c" exitCode=0 Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.159916 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" event={"ID":"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c","Type":"ContainerStarted","Data":"d517fe0a1884adcbb80ecea3b31ec86abf3883e5d3e899d88dc7514abb62e849"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.160008 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" event={"ID":"cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c","Type":"ContainerDied","Data":"bb49ce4b65c23b812e99b0e098f996415a4000cdc4493e6c3966aa15936c6e5c"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.162266 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-f7mzc" 
event={"ID":"91f8c44a-7b95-4212-8976-753251e9959b","Type":"ContainerStarted","Data":"47d0294b880dc0af156a2348ea192aadd1be95454983e96e4ccb6d7bfbc35844"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.162967 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-f7mzc" Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.164153 4565 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-f7mzc container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.164193 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-f7mzc" podUID="91f8c44a-7b95-4212-8976-753251e9959b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.165244 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-q5drb" event={"ID":"74651f90-e9e9-4c93-a929-26765c304243","Type":"ContainerStarted","Data":"b5b70818f72ff5beae32e90e45f8d3c3a2c2824d5269f2210a3fa5c672db00d5"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.167668 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5c4hq" event={"ID":"f1364197-953a-4e1b-ad5d-aff1a5f0f5ba","Type":"ContainerStarted","Data":"8d0e602e47ea2340fbd42d0fccfec07e308e5dcb8be499997c5016049e270d79"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.168109 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5c4hq" Nov 25 
09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.170712 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2fq46" event={"ID":"3193422a-c236-406b-a51a-e2865b720ff4","Type":"ContainerStarted","Data":"31e346d397c467c9f803378aa5a991df523f1189439e2c8022f0d1ba4bf705ad"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.170745 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2fq46" event={"ID":"3193422a-c236-406b-a51a-e2865b720ff4","Type":"ContainerStarted","Data":"b48331e682dcc1de9fad4f5d760a686caaad91c244f3af5cab55030d0ccc2a66"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.173204 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c567c" event={"ID":"d14ea7a8-bb04-40c4-a285-a69a80f1bc5a","Type":"ContainerStarted","Data":"1b62bb00dbb5dd8fe1a1c0cbc876135718d624ba6a856c6c35337d3b499fafaa"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.173235 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c567c" event={"ID":"d14ea7a8-bb04-40c4-a285-a69a80f1bc5a","Type":"ContainerStarted","Data":"ac3e1379a34ab265abfb1d2f9a1a2396e4ea02c91f225d7361e1042ac42f3318"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.180179 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-smktm" event={"ID":"00c23670-fc21-4730-a27e-ac490261f994","Type":"ContainerStarted","Data":"a07454b24fadd57cfd61ab04b215edd6abf3703d5c319128e56010a8dc8c8a91"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.182598 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cgzl7" 
event={"ID":"86e733b7-6fa7-4086-bb82-95cc1b9dfa51","Type":"ContainerStarted","Data":"a3d9504501b5688cfd9b2a963ab6abc364af1ec206b41949df95a72e81f901d1"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.205382 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tj8jh" podStartSLOduration=123.205370692 podStartE2EDuration="2m3.205370692s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:48.202291012 +0000 UTC m=+141.404786151" watchObservedRunningTime="2025-11-25 09:06:48.205370692 +0000 UTC m=+141.407865831" Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.225275 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:48 crc kubenswrapper[4565]: E1125 09:06:48.236171 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:48.736156295 +0000 UTC m=+141.938651432 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.254134 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-57ghm" event={"ID":"96805dde-4050-4b05-a10b-e7a1f1ed37c7","Type":"ContainerStarted","Data":"a507d102ca6bbacb32654f9b86b8c61817ad2ad32059bd8a72cce48d7579a21b"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.254189 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-57ghm" event={"ID":"96805dde-4050-4b05-a10b-e7a1f1ed37c7","Type":"ContainerStarted","Data":"aced1f6abdf4e1056e249cd005f3d87a12fc481863e5f2395900032f7e026190"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.255913 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9xwj" event={"ID":"f49defba-4d11-4d11-8a4f-d5fcbe187c73","Type":"ContainerStarted","Data":"f000c323b2794a25b25502c10aee81d9e4de75a9d99c1c1ec39de1c596141738"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.256722 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9xwj" Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.257953 4565 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-c9xwj container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.257987 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9xwj" podUID="f49defba-4d11-4d11-8a4f-d5fcbe187c73" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.266100 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401020-4vwj8" event={"ID":"a888c987-e97d-4f33-9932-158161870fe6","Type":"ContainerStarted","Data":"0cef7b14ef30b0eb4abaff2d20167e00b58f773da50f35150e7ca1b544ff4265"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.269870 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f8pjc" event={"ID":"66eb132d-fe22-490c-bd19-3c52de3b56ee","Type":"ContainerStarted","Data":"736472f0067b17e718bc3fa72d813bbc3f1284d558e625d63cca64f3bfd9f62b"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.274051 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-2fq46" podStartSLOduration=123.274039877 podStartE2EDuration="2m3.274039877s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:48.26276604 +0000 UTC m=+141.465261179" watchObservedRunningTime="2025-11-25 09:06:48.274039877 +0000 UTC m=+141.476535015" Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.274778 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v76jt" event={"ID":"c78a2319-04ae-46f7-8b1c-7baec60f1dc6","Type":"ContainerStarted","Data":"d6e884b422343d02cc7f3ef2e0c7cba4e3779894466fff074ed4f896725c0d22"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.289837 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" event={"ID":"6becb00b-a391-46f2-b821-c626b7903924","Type":"ContainerStarted","Data":"29d7cfffc2d5c554719770cf8fca5a9a49ce2a39332ac8993cfa978142c20415"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.294667 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-crrxv" event={"ID":"12c5c094-a435-4607-9e17-82b36d40c672","Type":"ContainerStarted","Data":"f2557552b0c3590a10315b9f22aa2994fc103a3634228d6f690b73843b442c1e"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.297168 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w788b" event={"ID":"4fe46bc1-6ac5-4578-a9d1-7ce7345b8ff0","Type":"ContainerStarted","Data":"c6c1c600a83cf0c72c54e2b02bc6a8a55bacee25f49a6e6dce3e399c47c0ab21"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.297671 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w788b" Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.312123 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8x25t" event={"ID":"befd0413-a4be-4b8f-9070-6a27ca4c2ca8","Type":"ContainerStarted","Data":"a656f6aa4c140f8cdc0a044a23522e1f8c22271b786acd0d42f56c7d6f071ae3"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.324103 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-q5drb" 
podStartSLOduration=123.32408881 podStartE2EDuration="2m3.32408881s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:48.32174964 +0000 UTC m=+141.524244777" watchObservedRunningTime="2025-11-25 09:06:48.32408881 +0000 UTC m=+141.526583948" Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.325506 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-4bqts" event={"ID":"0587232c-6c1f-44d8-b7d0-be44d147bd71","Type":"ContainerStarted","Data":"64ea30aa932b74caff940d6114e21d09b302b950cbedfcf303f484b553850139"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.336408 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bdn5h" event={"ID":"f9857afe-8e2a-4ddd-8fc2-2059bff84d86","Type":"ContainerStarted","Data":"3b5e8867fb31f0869f2c0b5aecfb11e098addbac0940ea487a6c8a0faca8148c"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.336440 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bdn5h" event={"ID":"f9857afe-8e2a-4ddd-8fc2-2059bff84d86","Type":"ContainerStarted","Data":"8c22dea9a1822667e424e22af6189741fdcf3683f64498befb121e9a9e945e8b"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.336849 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-bdn5h" Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.337132 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:48 crc kubenswrapper[4565]: E1125 09:06:48.337634 4565 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:48.837612347 +0000 UTC m=+142.040107486 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.337954 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:48 crc kubenswrapper[4565]: E1125 09:06:48.341333 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:48.841313524 +0000 UTC m=+142.043808662 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.348782 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s7fbs" event={"ID":"bebd549b-749f-4e30-ab21-99c32a85c0ca","Type":"ContainerStarted","Data":"fe70d7d3d23c52cef3d64ab5c2a66a37fe27040c26614df3f2785dbe8d7a962c"} Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.348832 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s7fbs" Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.392649 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w788b" Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.438744 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:48 crc kubenswrapper[4565]: E1125 09:06:48.440281 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 09:06:48.940264916 +0000 UTC m=+142.142760054 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.505781 4565 patch_prober.go:28] interesting pod/router-default-5444994796-h5ktx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 09:06:48 crc kubenswrapper[4565]: [-]has-synced failed: reason withheld Nov 25 09:06:48 crc kubenswrapper[4565]: [+]process-running ok Nov 25 09:06:48 crc kubenswrapper[4565]: healthz check failed Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.505881 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5ktx" podUID="66cd3426-ad3a-499d-8b09-056348e1413a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.541650 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:48 crc kubenswrapper[4565]: E1125 09:06:48.542038 4565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:49.04202548 +0000 UTC m=+142.244520618 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.641744 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5c4hq" podStartSLOduration=123.641725148 podStartE2EDuration="2m3.641725148s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:48.640492805 +0000 UTC m=+141.842987943" watchObservedRunningTime="2025-11-25 09:06:48.641725148 +0000 UTC m=+141.844220285" Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.644124 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-f7mzc" podStartSLOduration=123.644114742 podStartE2EDuration="2m3.644114742s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:48.467561872 +0000 UTC m=+141.670057010" watchObservedRunningTime="2025-11-25 09:06:48.644114742 +0000 UTC m=+141.846609880" Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.644832 4565 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:48 crc kubenswrapper[4565]: E1125 09:06:48.645044 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:49.145000143 +0000 UTC m=+142.347495281 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.645338 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:48 crc kubenswrapper[4565]: E1125 09:06:48.645694 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:49.145676993 +0000 UTC m=+142.348172131 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.664921 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.747417 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:48 crc kubenswrapper[4565]: E1125 09:06:48.747735 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:49.24772094 +0000 UTC m=+142.450216078 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.809011 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.809139 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.811096 4565 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-vj2db container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.17:8443/livez\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.811136 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" podUID="cd9f64b4-b7cb-4a24-9dfa-8eb5f1224b3c" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.17:8443/livez\": dial tcp 10.217.0.17:8443: connect: connection refused" Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.818535 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c567c" podStartSLOduration=123.818525191 podStartE2EDuration="2m3.818525191s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-11-25 09:06:48.749471194 +0000 UTC m=+141.951966332" watchObservedRunningTime="2025-11-25 09:06:48.818525191 +0000 UTC m=+142.021020329" Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.819328 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cgzl7" podStartSLOduration=123.819323199 podStartE2EDuration="2m3.819323199s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:48.816633671 +0000 UTC m=+142.019128809" watchObservedRunningTime="2025-11-25 09:06:48.819323199 +0000 UTC m=+142.021818327" Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.848892 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:48 crc kubenswrapper[4565]: E1125 09:06:48.849130 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:49.349120165 +0000 UTC m=+142.551615304 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.889660 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" podStartSLOduration=123.889650526 podStartE2EDuration="2m3.889650526s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:48.887537129 +0000 UTC m=+142.090032277" watchObservedRunningTime="2025-11-25 09:06:48.889650526 +0000 UTC m=+142.092145664" Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.949460 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:48 crc kubenswrapper[4565]: E1125 09:06:48.949869 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:49.449858371 +0000 UTC m=+142.652353510 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.987741 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v76jt" podStartSLOduration=123.987731454 podStartE2EDuration="2m3.987731454s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:48.976423805 +0000 UTC m=+142.178918942" watchObservedRunningTime="2025-11-25 09:06:48.987731454 +0000 UTC m=+142.190226592" Nov 25 09:06:48 crc kubenswrapper[4565]: I1125 09:06:48.988134 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s7fbs" podStartSLOduration=123.988129662 podStartE2EDuration="2m3.988129662s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:48.952343155 +0000 UTC m=+142.154838293" watchObservedRunningTime="2025-11-25 09:06:48.988129662 +0000 UTC m=+142.190624800" Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.050500 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:49 crc kubenswrapper[4565]: E1125 09:06:49.050817 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:49.55080587 +0000 UTC m=+142.753301008 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.106034 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9xwj" podStartSLOduration=124.106018813 podStartE2EDuration="2m4.106018813s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:49.055771569 +0000 UTC m=+142.258266707" watchObservedRunningTime="2025-11-25 09:06:49.106018813 +0000 UTC m=+142.308513951" Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.106191 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-57ghm" podStartSLOduration=125.106185446 podStartE2EDuration="2m5.106185446s" podCreationTimestamp="2025-11-25 09:04:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:49.09845542 +0000 UTC m=+142.300950558" watchObservedRunningTime="2025-11-25 09:06:49.106185446 +0000 UTC m=+142.308680585" Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.134041 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-4bqts" podStartSLOduration=124.13403009 podStartE2EDuration="2m4.13403009s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:49.132656781 +0000 UTC m=+142.335151920" watchObservedRunningTime="2025-11-25 09:06:49.13403009 +0000 UTC m=+142.336525227" Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.151437 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:49 crc kubenswrapper[4565]: E1125 09:06:49.151776 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:49.65176486 +0000 UTC m=+142.854259999 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.191149 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w788b" podStartSLOduration=124.191130073 podStartE2EDuration="2m4.191130073s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:49.184489251 +0000 UTC m=+142.386984390" watchObservedRunningTime="2025-11-25 09:06:49.191130073 +0000 UTC m=+142.393625211" Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.281651 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:49 crc kubenswrapper[4565]: E1125 09:06:49.282077 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:49.782063409 +0000 UTC m=+142.984558547 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.284270 4565 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-f8pjc container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.284421 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f8pjc" podUID="66eb132d-fe22-490c-bd19-3c52de3b56ee" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.301408 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-bdn5h" podStartSLOduration=9.301392072 podStartE2EDuration="9.301392072s" podCreationTimestamp="2025-11-25 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:49.298651368 +0000 UTC m=+142.501146506" watchObservedRunningTime="2025-11-25 09:06:49.301392072 +0000 UTC m=+142.503887210" Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.377388 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" podStartSLOduration=125.377368019 podStartE2EDuration="2m5.377368019s" podCreationTimestamp="2025-11-25 09:04:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:49.376284705 +0000 UTC m=+142.578779843" watchObservedRunningTime="2025-11-25 09:06:49.377368019 +0000 UTC m=+142.579863156" Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.383441 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:49 crc kubenswrapper[4565]: E1125 09:06:49.384002 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:49.88398755 +0000 UTC m=+143.086482689 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.395897 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-crrxv" event={"ID":"12c5c094-a435-4607-9e17-82b36d40c672","Type":"ContainerStarted","Data":"d3da688def1de583eac09e1188ce6d78edea8984a472e234ec5fddf03edcd36a"} Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.415256 4565 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-f7mzc container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.415310 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-f7mzc" podUID="91f8c44a-7b95-4212-8976-753251e9959b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.441159 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9xwj" Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.489372 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:49 crc kubenswrapper[4565]: E1125 09:06:49.495624 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:49.995611084 +0000 UTC m=+143.198106222 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.506890 4565 patch_prober.go:28] interesting pod/router-default-5444994796-h5ktx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 09:06:49 crc kubenswrapper[4565]: [-]has-synced failed: reason withheld Nov 25 09:06:49 crc kubenswrapper[4565]: [+]process-running ok Nov 25 09:06:49 crc kubenswrapper[4565]: healthz check failed Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.506980 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5ktx" podUID="66cd3426-ad3a-499d-8b09-056348e1413a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.590691 4565 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:49 crc kubenswrapper[4565]: E1125 09:06:49.591118 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:50.091104457 +0000 UTC m=+143.293599596 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.691962 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:49 crc kubenswrapper[4565]: E1125 09:06:49.692261 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:50.192250017 +0000 UTC m=+143.394745156 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.793180 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:49 crc kubenswrapper[4565]: E1125 09:06:49.793534 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:50.293521064 +0000 UTC m=+143.496016202 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.894310 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:49 crc kubenswrapper[4565]: E1125 09:06:49.894583 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:50.394571715 +0000 UTC m=+143.597066853 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.971848 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f8pjc" Nov 25 09:06:49 crc kubenswrapper[4565]: I1125 09:06:49.995677 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:49 crc kubenswrapper[4565]: E1125 09:06:49.996256 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:50.496242702 +0000 UTC m=+143.698737840 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.096999 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:50 crc kubenswrapper[4565]: E1125 09:06:50.097242 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:50.597231438 +0000 UTC m=+143.799726575 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.198007 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:50 crc kubenswrapper[4565]: E1125 09:06:50.198395 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:50.698380796 +0000 UTC m=+143.900875934 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.299218 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:50 crc kubenswrapper[4565]: E1125 09:06:50.299496 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:50.799485389 +0000 UTC m=+144.001980527 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.399683 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:50 crc kubenswrapper[4565]: E1125 09:06:50.399773 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:50.899761377 +0000 UTC m=+144.102256516 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.400056 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:50 crc kubenswrapper[4565]: E1125 09:06:50.400285 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:50.900277136 +0000 UTC m=+144.102772273 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.402801 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-crrxv" event={"ID":"12c5c094-a435-4607-9e17-82b36d40c672","Type":"ContainerStarted","Data":"72147cf3b477e999bea7c245d621cbe2d06f4b248c055abe772db9860bfd5b9d"} Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.405001 4565 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-f7mzc container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.405028 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-f7mzc" podUID="91f8c44a-7b95-4212-8976-753251e9959b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.500633 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:50 crc kubenswrapper[4565]: E1125 09:06:50.501598 4565 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:51.001576133 +0000 UTC m=+144.204071271 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.504851 4565 patch_prober.go:28] interesting pod/router-default-5444994796-h5ktx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 09:06:50 crc kubenswrapper[4565]: [-]has-synced failed: reason withheld Nov 25 09:06:50 crc kubenswrapper[4565]: [+]process-running ok Nov 25 09:06:50 crc kubenswrapper[4565]: healthz check failed Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.505629 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5ktx" podUID="66cd3426-ad3a-499d-8b09-056348e1413a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.602622 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:50 crc kubenswrapper[4565]: E1125 09:06:50.603092 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:51.103078022 +0000 UTC m=+144.305573160 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.704021 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:50 crc kubenswrapper[4565]: E1125 09:06:50.704294 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:51.20428069 +0000 UTC m=+144.406775828 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.794510 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s9bxs"] Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.795257 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s9bxs" Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.798982 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.805863 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:50 crc kubenswrapper[4565]: E1125 09:06:50.806160 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:51.30615101 +0000 UTC m=+144.508646148 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.879999 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s9bxs"] Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.906551 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.906749 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f41ac7d-f98d-4d50-8346-fc730f41309c-catalog-content\") pod \"certified-operators-s9bxs\" (UID: \"3f41ac7d-f98d-4d50-8346-fc730f41309c\") " pod="openshift-marketplace/certified-operators-s9bxs" Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.906809 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f41ac7d-f98d-4d50-8346-fc730f41309c-utilities\") pod \"certified-operators-s9bxs\" (UID: \"3f41ac7d-f98d-4d50-8346-fc730f41309c\") " pod="openshift-marketplace/certified-operators-s9bxs" Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.906923 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-kzmfz\" (UniqueName: \"kubernetes.io/projected/3f41ac7d-f98d-4d50-8346-fc730f41309c-kube-api-access-kzmfz\") pod \"certified-operators-s9bxs\" (UID: \"3f41ac7d-f98d-4d50-8346-fc730f41309c\") " pod="openshift-marketplace/certified-operators-s9bxs" Nov 25 09:06:50 crc kubenswrapper[4565]: E1125 09:06:50.907059 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:51.40704602 +0000 UTC m=+144.609541158 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.989968 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d66wt"] Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.990869 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d66wt" Nov 25 09:06:50 crc kubenswrapper[4565]: I1125 09:06:50.992125 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.006295 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d66wt"] Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.008303 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.008347 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzmfz\" (UniqueName: \"kubernetes.io/projected/3f41ac7d-f98d-4d50-8346-fc730f41309c-kube-api-access-kzmfz\") pod \"certified-operators-s9bxs\" (UID: \"3f41ac7d-f98d-4d50-8346-fc730f41309c\") " pod="openshift-marketplace/certified-operators-s9bxs" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.008422 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f41ac7d-f98d-4d50-8346-fc730f41309c-catalog-content\") pod \"certified-operators-s9bxs\" (UID: \"3f41ac7d-f98d-4d50-8346-fc730f41309c\") " pod="openshift-marketplace/certified-operators-s9bxs" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.008447 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f41ac7d-f98d-4d50-8346-fc730f41309c-utilities\") pod \"certified-operators-s9bxs\" (UID: 
\"3f41ac7d-f98d-4d50-8346-fc730f41309c\") " pod="openshift-marketplace/certified-operators-s9bxs" Nov 25 09:06:51 crc kubenswrapper[4565]: E1125 09:06:51.008589 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:51.508576803 +0000 UTC m=+144.711071942 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.009104 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f41ac7d-f98d-4d50-8346-fc730f41309c-catalog-content\") pod \"certified-operators-s9bxs\" (UID: \"3f41ac7d-f98d-4d50-8346-fc730f41309c\") " pod="openshift-marketplace/certified-operators-s9bxs" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.009500 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f41ac7d-f98d-4d50-8346-fc730f41309c-utilities\") pod \"certified-operators-s9bxs\" (UID: \"3f41ac7d-f98d-4d50-8346-fc730f41309c\") " pod="openshift-marketplace/certified-operators-s9bxs" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.029302 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzmfz\" (UniqueName: \"kubernetes.io/projected/3f41ac7d-f98d-4d50-8346-fc730f41309c-kube-api-access-kzmfz\") pod \"certified-operators-s9bxs\" (UID: 
\"3f41ac7d-f98d-4d50-8346-fc730f41309c\") " pod="openshift-marketplace/certified-operators-s9bxs" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.106600 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s9bxs" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.108741 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.109071 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63ed490a-3e8d-485a-80d9-8c482c9172a1-catalog-content\") pod \"community-operators-d66wt\" (UID: \"63ed490a-3e8d-485a-80d9-8c482c9172a1\") " pod="openshift-marketplace/community-operators-d66wt" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.109138 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63ed490a-3e8d-485a-80d9-8c482c9172a1-utilities\") pod \"community-operators-d66wt\" (UID: \"63ed490a-3e8d-485a-80d9-8c482c9172a1\") " pod="openshift-marketplace/community-operators-d66wt" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.109164 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcfpd\" (UniqueName: \"kubernetes.io/projected/63ed490a-3e8d-485a-80d9-8c482c9172a1-kube-api-access-mcfpd\") pod \"community-operators-d66wt\" (UID: \"63ed490a-3e8d-485a-80d9-8c482c9172a1\") " pod="openshift-marketplace/community-operators-d66wt" Nov 25 09:06:51 crc kubenswrapper[4565]: E1125 09:06:51.109250 
4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:51.609236662 +0000 UTC m=+144.811731800 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.210680 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63ed490a-3e8d-485a-80d9-8c482c9172a1-catalog-content\") pod \"community-operators-d66wt\" (UID: \"63ed490a-3e8d-485a-80d9-8c482c9172a1\") " pod="openshift-marketplace/community-operators-d66wt" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.211430 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.211473 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63ed490a-3e8d-485a-80d9-8c482c9172a1-utilities\") pod \"community-operators-d66wt\" (UID: \"63ed490a-3e8d-485a-80d9-8c482c9172a1\") " 
pod="openshift-marketplace/community-operators-d66wt" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.211492 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcfpd\" (UniqueName: \"kubernetes.io/projected/63ed490a-3e8d-485a-80d9-8c482c9172a1-kube-api-access-mcfpd\") pod \"community-operators-d66wt\" (UID: \"63ed490a-3e8d-485a-80d9-8c482c9172a1\") " pod="openshift-marketplace/community-operators-d66wt" Nov 25 09:06:51 crc kubenswrapper[4565]: E1125 09:06:51.212116 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:51.712100546 +0000 UTC m=+144.914595685 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.212187 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63ed490a-3e8d-485a-80d9-8c482c9172a1-utilities\") pod \"community-operators-d66wt\" (UID: \"63ed490a-3e8d-485a-80d9-8c482c9172a1\") " pod="openshift-marketplace/community-operators-d66wt" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.211382 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63ed490a-3e8d-485a-80d9-8c482c9172a1-catalog-content\") pod \"community-operators-d66wt\" (UID: \"63ed490a-3e8d-485a-80d9-8c482c9172a1\") " 
pod="openshift-marketplace/community-operators-d66wt" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.217442 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pwl47"] Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.219222 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pwl47" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.230974 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pwl47"] Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.239674 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcfpd\" (UniqueName: \"kubernetes.io/projected/63ed490a-3e8d-485a-80d9-8c482c9172a1-kube-api-access-mcfpd\") pod \"community-operators-d66wt\" (UID: \"63ed490a-3e8d-485a-80d9-8c482c9172a1\") " pod="openshift-marketplace/community-operators-d66wt" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.301527 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d66wt" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.312164 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.312461 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50c07298-4434-42c5-ab63-fe196405db28-utilities\") pod \"certified-operators-pwl47\" (UID: \"50c07298-4434-42c5-ab63-fe196405db28\") " pod="openshift-marketplace/certified-operators-pwl47" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.312488 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50c07298-4434-42c5-ab63-fe196405db28-catalog-content\") pod \"certified-operators-pwl47\" (UID: \"50c07298-4434-42c5-ab63-fe196405db28\") " pod="openshift-marketplace/certified-operators-pwl47" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.312538 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzsgw\" (UniqueName: \"kubernetes.io/projected/50c07298-4434-42c5-ab63-fe196405db28-kube-api-access-jzsgw\") pod \"certified-operators-pwl47\" (UID: \"50c07298-4434-42c5-ab63-fe196405db28\") " pod="openshift-marketplace/certified-operators-pwl47" Nov 25 09:06:51 crc kubenswrapper[4565]: E1125 09:06:51.312616 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-11-25 09:06:51.812602339 +0000 UTC m=+145.015097478 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.395592 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qzjfq"] Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.396382 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qzjfq" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.403409 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qzjfq"] Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.415388 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50c07298-4434-42c5-ab63-fe196405db28-utilities\") pod \"certified-operators-pwl47\" (UID: \"50c07298-4434-42c5-ab63-fe196405db28\") " pod="openshift-marketplace/certified-operators-pwl47" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.415420 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50c07298-4434-42c5-ab63-fe196405db28-catalog-content\") pod \"certified-operators-pwl47\" (UID: \"50c07298-4434-42c5-ab63-fe196405db28\") " pod="openshift-marketplace/certified-operators-pwl47" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.415535 4565 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jzsgw\" (UniqueName: \"kubernetes.io/projected/50c07298-4434-42c5-ab63-fe196405db28-kube-api-access-jzsgw\") pod \"certified-operators-pwl47\" (UID: \"50c07298-4434-42c5-ab63-fe196405db28\") " pod="openshift-marketplace/certified-operators-pwl47" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.415740 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50c07298-4434-42c5-ab63-fe196405db28-utilities\") pod \"certified-operators-pwl47\" (UID: \"50c07298-4434-42c5-ab63-fe196405db28\") " pod="openshift-marketplace/certified-operators-pwl47" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.416687 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:51 crc kubenswrapper[4565]: E1125 09:06:51.418280 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:51.916916536 +0000 UTC m=+145.119411664 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.418527 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50c07298-4434-42c5-ab63-fe196405db28-catalog-content\") pod \"certified-operators-pwl47\" (UID: \"50c07298-4434-42c5-ab63-fe196405db28\") " pod="openshift-marketplace/certified-operators-pwl47" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.441473 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-crrxv" event={"ID":"12c5c094-a435-4607-9e17-82b36d40c672","Type":"ContainerStarted","Data":"5ba8c64fe2fae97af3f3dcaefcf261cdedc5d84d337e01b42098bc0439a7a085"} Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.448577 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzsgw\" (UniqueName: \"kubernetes.io/projected/50c07298-4434-42c5-ab63-fe196405db28-kube-api-access-jzsgw\") pod \"certified-operators-pwl47\" (UID: \"50c07298-4434-42c5-ab63-fe196405db28\") " pod="openshift-marketplace/certified-operators-pwl47" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.458749 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-crrxv" podStartSLOduration=11.458739692 podStartE2EDuration="11.458739692s" podCreationTimestamp="2025-11-25 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-25 09:06:51.456839856 +0000 UTC m=+144.659334994" watchObservedRunningTime="2025-11-25 09:06:51.458739692 +0000 UTC m=+144.661234829" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.483738 4565 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.507311 4565 patch_prober.go:28] interesting pod/router-default-5444994796-h5ktx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 09:06:51 crc kubenswrapper[4565]: [-]has-synced failed: reason withheld Nov 25 09:06:51 crc kubenswrapper[4565]: [+]process-running ok Nov 25 09:06:51 crc kubenswrapper[4565]: healthz check failed Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.507355 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5ktx" podUID="66cd3426-ad3a-499d-8b09-056348e1413a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.517157 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.517383 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2qnn\" (UniqueName: \"kubernetes.io/projected/9d44c664-75e0-4020-a8f6-8bd5ac874798-kube-api-access-q2qnn\") pod \"community-operators-qzjfq\" (UID: 
\"9d44c664-75e0-4020-a8f6-8bd5ac874798\") " pod="openshift-marketplace/community-operators-qzjfq" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.517441 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d44c664-75e0-4020-a8f6-8bd5ac874798-utilities\") pod \"community-operators-qzjfq\" (UID: \"9d44c664-75e0-4020-a8f6-8bd5ac874798\") " pod="openshift-marketplace/community-operators-qzjfq" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.517489 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d44c664-75e0-4020-a8f6-8bd5ac874798-catalog-content\") pod \"community-operators-qzjfq\" (UID: \"9d44c664-75e0-4020-a8f6-8bd5ac874798\") " pod="openshift-marketplace/community-operators-qzjfq" Nov 25 09:06:51 crc kubenswrapper[4565]: E1125 09:06:51.518155 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:52.018143839 +0000 UTC m=+145.220638977 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.556098 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d66wt"] Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.568478 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pwl47" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.619140 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d44c664-75e0-4020-a8f6-8bd5ac874798-catalog-content\") pod \"community-operators-qzjfq\" (UID: \"9d44c664-75e0-4020-a8f6-8bd5ac874798\") " pod="openshift-marketplace/community-operators-qzjfq" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.619179 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.619272 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2qnn\" (UniqueName: \"kubernetes.io/projected/9d44c664-75e0-4020-a8f6-8bd5ac874798-kube-api-access-q2qnn\") pod \"community-operators-qzjfq\" (UID: 
\"9d44c664-75e0-4020-a8f6-8bd5ac874798\") " pod="openshift-marketplace/community-operators-qzjfq" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.619323 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d44c664-75e0-4020-a8f6-8bd5ac874798-utilities\") pod \"community-operators-qzjfq\" (UID: \"9d44c664-75e0-4020-a8f6-8bd5ac874798\") " pod="openshift-marketplace/community-operators-qzjfq" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.619595 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d44c664-75e0-4020-a8f6-8bd5ac874798-catalog-content\") pod \"community-operators-qzjfq\" (UID: \"9d44c664-75e0-4020-a8f6-8bd5ac874798\") " pod="openshift-marketplace/community-operators-qzjfq" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.619697 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d44c664-75e0-4020-a8f6-8bd5ac874798-utilities\") pod \"community-operators-qzjfq\" (UID: \"9d44c664-75e0-4020-a8f6-8bd5ac874798\") " pod="openshift-marketplace/community-operators-qzjfq" Nov 25 09:06:51 crc kubenswrapper[4565]: E1125 09:06:51.619847 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:52.119834652 +0000 UTC m=+145.322329791 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.627455 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s9bxs"] Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.634510 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2qnn\" (UniqueName: \"kubernetes.io/projected/9d44c664-75e0-4020-a8f6-8bd5ac874798-kube-api-access-q2qnn\") pod \"community-operators-qzjfq\" (UID: \"9d44c664-75e0-4020-a8f6-8bd5ac874798\") " pod="openshift-marketplace/community-operators-qzjfq" Nov 25 09:06:51 crc kubenswrapper[4565]: W1125 09:06:51.638529 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f41ac7d_f98d_4d50_8346_fc730f41309c.slice/crio-bf6e1bcbc093040ed177f728f6baec304019320481496e9735ecc747e4c55d7d WatchSource:0}: Error finding container bf6e1bcbc093040ed177f728f6baec304019320481496e9735ecc747e4c55d7d: Status 404 returned error can't find the container with id bf6e1bcbc093040ed177f728f6baec304019320481496e9735ecc747e4c55d7d Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.719900 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:51 crc 
kubenswrapper[4565]: E1125 09:06:51.720048 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:52.220028988 +0000 UTC m=+145.422524125 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.720398 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.720640 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qzjfq" Nov 25 09:06:51 crc kubenswrapper[4565]: E1125 09:06:51.720666 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:52.220657016 +0000 UTC m=+145.423152155 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.767679 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pwl47"] Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.821427 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:51 crc kubenswrapper[4565]: E1125 09:06:51.821791 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:52.321779384 +0000 UTC m=+145.524274522 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.887534 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qzjfq"] Nov 25 09:06:51 crc kubenswrapper[4565]: W1125 09:06:51.893639 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d44c664_75e0_4020_a8f6_8bd5ac874798.slice/crio-da0b9c8d32461d1435c2e921aaf40c43735c3cca3544dcf4345e0ba568edda38 WatchSource:0}: Error finding container da0b9c8d32461d1435c2e921aaf40c43735c3cca3544dcf4345e0ba568edda38: Status 404 returned error can't find the container with id da0b9c8d32461d1435c2e921aaf40c43735c3cca3544dcf4345e0ba568edda38 Nov 25 09:06:51 crc kubenswrapper[4565]: I1125 09:06:51.922444 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:51 crc kubenswrapper[4565]: E1125 09:06:51.922851 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:52.422839173 +0000 UTC m=+145.625334311 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.024457 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:52 crc kubenswrapper[4565]: E1125 09:06:52.024629 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:52.524614995 +0000 UTC m=+145.727110134 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.024983 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:52 crc kubenswrapper[4565]: E1125 09:06:52.025252 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:52.525243525 +0000 UTC m=+145.727738663 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.125982 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:52 crc kubenswrapper[4565]: E1125 09:06:52.126271 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 09:06:52.626262578 +0000 UTC m=+145.828757716 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.227369 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:52 crc kubenswrapper[4565]: E1125 09:06:52.227712 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 09:06:52.72770194 +0000 UTC m=+145.930197078 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fljns" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.236624 4565 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-25T09:06:51.483755386Z","Handler":null,"Name":""} Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.239284 4565 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.239412 4565 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.328645 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.331533 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: 
"8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.392345 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s7fbs" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.429893 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.431807 4565 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.431844 4565 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.446302 4565 generic.go:334] "Generic (PLEG): container finished" podID="50c07298-4434-42c5-ab63-fe196405db28" containerID="747f2c7033e6ec153cc18be4fc5acd8942e38f1262e43aed01b02d61174e6e45" exitCode=0 Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.446374 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pwl47" event={"ID":"50c07298-4434-42c5-ab63-fe196405db28","Type":"ContainerDied","Data":"747f2c7033e6ec153cc18be4fc5acd8942e38f1262e43aed01b02d61174e6e45"} Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.446399 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pwl47" event={"ID":"50c07298-4434-42c5-ab63-fe196405db28","Type":"ContainerStarted","Data":"e11fbaa7438dbcea99cc2a5db1e182efb215dd1c874dc0114fe9a42abc71813d"} Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.447818 4565 generic.go:334] "Generic (PLEG): container finished" podID="63ed490a-3e8d-485a-80d9-8c482c9172a1" containerID="4e3504a2876dc3b611324bc0de7a73a51b5e180b846042172cfd79f12b87aa28" exitCode=0 Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.447906 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d66wt" 
event={"ID":"63ed490a-3e8d-485a-80d9-8c482c9172a1","Type":"ContainerDied","Data":"4e3504a2876dc3b611324bc0de7a73a51b5e180b846042172cfd79f12b87aa28"} Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.447952 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d66wt" event={"ID":"63ed490a-3e8d-485a-80d9-8c482c9172a1","Type":"ContainerStarted","Data":"65d770105ee6c5fcf476e1da3b7ba8a2185db25ec6649c8952e19c920cf5b288"} Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.448318 4565 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.450053 4565 generic.go:334] "Generic (PLEG): container finished" podID="a888c987-e97d-4f33-9932-158161870fe6" containerID="0cef7b14ef30b0eb4abaff2d20167e00b58f773da50f35150e7ca1b544ff4265" exitCode=0 Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.450081 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401020-4vwj8" event={"ID":"a888c987-e97d-4f33-9932-158161870fe6","Type":"ContainerDied","Data":"0cef7b14ef30b0eb4abaff2d20167e00b58f773da50f35150e7ca1b544ff4265"} Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.451813 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fljns\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.452164 4565 generic.go:334] "Generic (PLEG): container finished" podID="3f41ac7d-f98d-4d50-8346-fc730f41309c" containerID="0355a4807f8e34365e855245b50872a3ca97faca7b8c738a349d1c2cb72324b0" exitCode=0 Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.452203 
4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9bxs" event={"ID":"3f41ac7d-f98d-4d50-8346-fc730f41309c","Type":"ContainerDied","Data":"0355a4807f8e34365e855245b50872a3ca97faca7b8c738a349d1c2cb72324b0"} Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.452241 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9bxs" event={"ID":"3f41ac7d-f98d-4d50-8346-fc730f41309c","Type":"ContainerStarted","Data":"bf6e1bcbc093040ed177f728f6baec304019320481496e9735ecc747e4c55d7d"} Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.453538 4565 generic.go:334] "Generic (PLEG): container finished" podID="9d44c664-75e0-4020-a8f6-8bd5ac874798" containerID="d7a26242e4411044b947b6d6dd3199f2728121dbbf06d9cc534dff81e8356891" exitCode=0 Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.453562 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzjfq" event={"ID":"9d44c664-75e0-4020-a8f6-8bd5ac874798","Type":"ContainerDied","Data":"d7a26242e4411044b947b6d6dd3199f2728121dbbf06d9cc534dff81e8356891"} Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.453600 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzjfq" event={"ID":"9d44c664-75e0-4020-a8f6-8bd5ac874798","Type":"ContainerStarted","Data":"da0b9c8d32461d1435c2e921aaf40c43735c3cca3544dcf4345e0ba568edda38"} Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.506536 4565 patch_prober.go:28] interesting pod/router-default-5444994796-h5ktx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 09:06:52 crc kubenswrapper[4565]: [-]has-synced failed: reason withheld Nov 25 09:06:52 crc kubenswrapper[4565]: [+]process-running ok Nov 25 09:06:52 crc kubenswrapper[4565]: healthz check failed 
Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.506583 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5ktx" podUID="66cd3426-ad3a-499d-8b09-056348e1413a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.583365 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.583415 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.585075 4565 patch_prober.go:28] interesting pod/console-f9d7485db-wvc8w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.585123 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-wvc8w" podUID="418a0125-b167-49b8-b6bd-0c97a587107c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.602509 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.603087 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.607233 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.611419 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.632188 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.634524 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.634605 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.674574 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.737092 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.737155 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.737213 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.761662 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.783618 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hxkk7"] Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.786558 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hxkk7" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.788648 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.802806 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hxkk7"] Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.837945 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlxjc\" (UniqueName: \"kubernetes.io/projected/5846c394-bccc-4d13-8cea-d3deb171c550-kube-api-access-wlxjc\") pod \"redhat-marketplace-hxkk7\" (UID: \"5846c394-bccc-4d13-8cea-d3deb171c550\") " pod="openshift-marketplace/redhat-marketplace-hxkk7" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.838032 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5846c394-bccc-4d13-8cea-d3deb171c550-utilities\") pod \"redhat-marketplace-hxkk7\" (UID: \"5846c394-bccc-4d13-8cea-d3deb171c550\") " pod="openshift-marketplace/redhat-marketplace-hxkk7" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.838053 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5846c394-bccc-4d13-8cea-d3deb171c550-catalog-content\") pod \"redhat-marketplace-hxkk7\" (UID: \"5846c394-bccc-4d13-8cea-d3deb171c550\") " pod="openshift-marketplace/redhat-marketplace-hxkk7" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.891476 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fljns"] Nov 25 09:06:52 crc kubenswrapper[4565]: W1125 09:06:52.896186 4565 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod755dab00_cc07_483e_82b6_8a3e54e6dee3.slice/crio-743dd1c94f52feaf1f2e03d5c604327d4d6216f8093a06c862117ea0a7cf1889 WatchSource:0}: Error finding container 743dd1c94f52feaf1f2e03d5c604327d4d6216f8093a06c862117ea0a7cf1889: Status 404 returned error can't find the container with id 743dd1c94f52feaf1f2e03d5c604327d4d6216f8093a06c862117ea0a7cf1889 Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.921233 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.940856 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5846c394-bccc-4d13-8cea-d3deb171c550-utilities\") pod \"redhat-marketplace-hxkk7\" (UID: \"5846c394-bccc-4d13-8cea-d3deb171c550\") " pod="openshift-marketplace/redhat-marketplace-hxkk7" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.941007 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5846c394-bccc-4d13-8cea-d3deb171c550-catalog-content\") pod \"redhat-marketplace-hxkk7\" (UID: \"5846c394-bccc-4d13-8cea-d3deb171c550\") " pod="openshift-marketplace/redhat-marketplace-hxkk7" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.941656 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5846c394-bccc-4d13-8cea-d3deb171c550-utilities\") pod \"redhat-marketplace-hxkk7\" (UID: \"5846c394-bccc-4d13-8cea-d3deb171c550\") " pod="openshift-marketplace/redhat-marketplace-hxkk7" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.941823 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5846c394-bccc-4d13-8cea-d3deb171c550-catalog-content\") pod \"redhat-marketplace-hxkk7\" (UID: \"5846c394-bccc-4d13-8cea-d3deb171c550\") " pod="openshift-marketplace/redhat-marketplace-hxkk7" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.941993 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlxjc\" (UniqueName: \"kubernetes.io/projected/5846c394-bccc-4d13-8cea-d3deb171c550-kube-api-access-wlxjc\") pod \"redhat-marketplace-hxkk7\" (UID: \"5846c394-bccc-4d13-8cea-d3deb171c550\") " pod="openshift-marketplace/redhat-marketplace-hxkk7" Nov 25 09:06:52 crc kubenswrapper[4565]: I1125 09:06:52.959405 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlxjc\" (UniqueName: \"kubernetes.io/projected/5846c394-bccc-4d13-8cea-d3deb171c550-kube-api-access-wlxjc\") pod \"redhat-marketplace-hxkk7\" (UID: \"5846c394-bccc-4d13-8cea-d3deb171c550\") " pod="openshift-marketplace/redhat-marketplace-hxkk7" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.042521 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.042594 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.042631 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.042657 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.046093 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.047159 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.047530 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.048716 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.103291 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.117034 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hxkk7" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.126128 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.126167 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.143477 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.186471 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hcxpx"] Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.187876 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hcxpx" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.206552 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcxpx"] Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.206976 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.217102 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.222055 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.246978 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e1cda59-b195-4f26-9b05-156ac97681d5-utilities\") pod \"redhat-marketplace-hcxpx\" (UID: \"6e1cda59-b195-4f26-9b05-156ac97681d5\") " pod="openshift-marketplace/redhat-marketplace-hcxpx" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.247036 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f7m2\" (UniqueName: \"kubernetes.io/projected/6e1cda59-b195-4f26-9b05-156ac97681d5-kube-api-access-6f7m2\") pod \"redhat-marketplace-hcxpx\" (UID: \"6e1cda59-b195-4f26-9b05-156ac97681d5\") " pod="openshift-marketplace/redhat-marketplace-hcxpx" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.247198 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e1cda59-b195-4f26-9b05-156ac97681d5-catalog-content\") pod 
\"redhat-marketplace-hcxpx\" (UID: \"6e1cda59-b195-4f26-9b05-156ac97681d5\") " pod="openshift-marketplace/redhat-marketplace-hcxpx" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.293683 4565 patch_prober.go:28] interesting pod/downloads-7954f5f757-wbp5x container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.293764 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-wbp5x" podUID="cd73c816-12d9-42bc-bc10-2180ed13b36a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.293807 4565 patch_prober.go:28] interesting pod/downloads-7954f5f757-wbp5x container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.293909 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wbp5x" podUID="cd73c816-12d9-42bc-bc10-2180ed13b36a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.348478 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e1cda59-b195-4f26-9b05-156ac97681d5-catalog-content\") pod \"redhat-marketplace-hcxpx\" (UID: \"6e1cda59-b195-4f26-9b05-156ac97681d5\") " pod="openshift-marketplace/redhat-marketplace-hcxpx" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.348560 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e1cda59-b195-4f26-9b05-156ac97681d5-utilities\") pod \"redhat-marketplace-hcxpx\" (UID: \"6e1cda59-b195-4f26-9b05-156ac97681d5\") " pod="openshift-marketplace/redhat-marketplace-hcxpx" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.348587 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f7m2\" (UniqueName: \"kubernetes.io/projected/6e1cda59-b195-4f26-9b05-156ac97681d5-kube-api-access-6f7m2\") pod \"redhat-marketplace-hcxpx\" (UID: \"6e1cda59-b195-4f26-9b05-156ac97681d5\") " pod="openshift-marketplace/redhat-marketplace-hcxpx" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.349800 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e1cda59-b195-4f26-9b05-156ac97681d5-catalog-content\") pod \"redhat-marketplace-hcxpx\" (UID: \"6e1cda59-b195-4f26-9b05-156ac97681d5\") " pod="openshift-marketplace/redhat-marketplace-hcxpx" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.351673 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e1cda59-b195-4f26-9b05-156ac97681d5-utilities\") pod \"redhat-marketplace-hcxpx\" (UID: \"6e1cda59-b195-4f26-9b05-156ac97681d5\") " pod="openshift-marketplace/redhat-marketplace-hcxpx" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.363356 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.385295 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f7m2\" (UniqueName: \"kubernetes.io/projected/6e1cda59-b195-4f26-9b05-156ac97681d5-kube-api-access-6f7m2\") pod \"redhat-marketplace-hcxpx\" (UID: \"6e1cda59-b195-4f26-9b05-156ac97681d5\") 
" pod="openshift-marketplace/redhat-marketplace-hcxpx" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.470808 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-fljns" event={"ID":"755dab00-cc07-483e-82b6-8a3e54e6dee3","Type":"ContainerStarted","Data":"41949da05fd9037742caa54c029e05ed27287cd560a5bb3305ac051698db9243"} Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.470878 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-fljns" event={"ID":"755dab00-cc07-483e-82b6-8a3e54e6dee3","Type":"ContainerStarted","Data":"743dd1c94f52feaf1f2e03d5c604327d4d6216f8093a06c862117ea0a7cf1889"} Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.470972 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.473545 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb","Type":"ContainerStarted","Data":"7ab1d1818b8ae85a0f618f5403296908ecbc181d48948c42044d0ee915edd5af"} Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.487827 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-vvqqw" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.502653 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-h5ktx" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.506922 4565 patch_prober.go:28] interesting pod/router-default-5444994796-h5ktx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 09:06:53 crc kubenswrapper[4565]: [-]has-synced failed: reason 
withheld Nov 25 09:06:53 crc kubenswrapper[4565]: [+]process-running ok Nov 25 09:06:53 crc kubenswrapper[4565]: healthz check failed Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.507002 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5ktx" podUID="66cd3426-ad3a-499d-8b09-056348e1413a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.516105 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hcxpx" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.548555 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-fljns" podStartSLOduration=128.548531004 podStartE2EDuration="2m8.548531004s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:53.501075829 +0000 UTC m=+146.703570967" watchObservedRunningTime="2025-11-25 09:06:53.548531004 +0000 UTC m=+146.751026142" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.668618 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hxkk7"] Nov 25 09:06:53 crc kubenswrapper[4565]: W1125 09:06:53.711410 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5846c394_bccc_4d13_8cea_d3deb171c550.slice/crio-4604a7c85adbaa945b400d5cf6f7de985ee5af30c28f46d45f1f10c8926e9543 WatchSource:0}: Error finding container 4604a7c85adbaa945b400d5cf6f7de985ee5af30c28f46d45f1f10c8926e9543: Status 404 returned error can't find the container with id 4604a7c85adbaa945b400d5cf6f7de985ee5af30c28f46d45f1f10c8926e9543 Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 
09:06:53.832473 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.846045 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vj2db" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.940214 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-f7mzc" Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.979956 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 25 09:06:53 crc kubenswrapper[4565]: I1125 09:06:53.981004 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.001893 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.002148 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.002315 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.025990 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j4tjs"] Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.027994 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j4tjs" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.048272 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.104559 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j4tjs"] Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.188637 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcxpx"] Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.189646 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfnwx\" (UniqueName: \"kubernetes.io/projected/4191dbab-fe34-482b-a895-adef8dc1c4a0-kube-api-access-qfnwx\") pod \"redhat-operators-j4tjs\" (UID: \"4191dbab-fe34-482b-a895-adef8dc1c4a0\") " pod="openshift-marketplace/redhat-operators-j4tjs" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.189700 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/433060c6-c174-4a38-9a33-224b8abcab6a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"433060c6-c174-4a38-9a33-224b8abcab6a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.189814 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4191dbab-fe34-482b-a895-adef8dc1c4a0-utilities\") pod \"redhat-operators-j4tjs\" (UID: \"4191dbab-fe34-482b-a895-adef8dc1c4a0\") " pod="openshift-marketplace/redhat-operators-j4tjs" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.189836 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4191dbab-fe34-482b-a895-adef8dc1c4a0-catalog-content\") pod \"redhat-operators-j4tjs\" (UID: \"4191dbab-fe34-482b-a895-adef8dc1c4a0\") " pod="openshift-marketplace/redhat-operators-j4tjs" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.189867 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/433060c6-c174-4a38-9a33-224b8abcab6a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"433060c6-c174-4a38-9a33-224b8abcab6a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.240637 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401020-4vwj8" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.291069 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4191dbab-fe34-482b-a895-adef8dc1c4a0-utilities\") pod \"redhat-operators-j4tjs\" (UID: \"4191dbab-fe34-482b-a895-adef8dc1c4a0\") " pod="openshift-marketplace/redhat-operators-j4tjs" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.291109 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4191dbab-fe34-482b-a895-adef8dc1c4a0-catalog-content\") pod \"redhat-operators-j4tjs\" (UID: \"4191dbab-fe34-482b-a895-adef8dc1c4a0\") " pod="openshift-marketplace/redhat-operators-j4tjs" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.291133 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/433060c6-c174-4a38-9a33-224b8abcab6a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"433060c6-c174-4a38-9a33-224b8abcab6a\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.291213 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfnwx\" (UniqueName: \"kubernetes.io/projected/4191dbab-fe34-482b-a895-adef8dc1c4a0-kube-api-access-qfnwx\") pod \"redhat-operators-j4tjs\" (UID: \"4191dbab-fe34-482b-a895-adef8dc1c4a0\") " pod="openshift-marketplace/redhat-operators-j4tjs" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.291239 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/433060c6-c174-4a38-9a33-224b8abcab6a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"433060c6-c174-4a38-9a33-224b8abcab6a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.291522 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/433060c6-c174-4a38-9a33-224b8abcab6a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"433060c6-c174-4a38-9a33-224b8abcab6a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.291621 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4191dbab-fe34-482b-a895-adef8dc1c4a0-catalog-content\") pod \"redhat-operators-j4tjs\" (UID: \"4191dbab-fe34-482b-a895-adef8dc1c4a0\") " pod="openshift-marketplace/redhat-operators-j4tjs" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.291916 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4191dbab-fe34-482b-a895-adef8dc1c4a0-utilities\") pod \"redhat-operators-j4tjs\" (UID: \"4191dbab-fe34-482b-a895-adef8dc1c4a0\") " 
pod="openshift-marketplace/redhat-operators-j4tjs" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.319604 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfnwx\" (UniqueName: \"kubernetes.io/projected/4191dbab-fe34-482b-a895-adef8dc1c4a0-kube-api-access-qfnwx\") pod \"redhat-operators-j4tjs\" (UID: \"4191dbab-fe34-482b-a895-adef8dc1c4a0\") " pod="openshift-marketplace/redhat-operators-j4tjs" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.328193 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/433060c6-c174-4a38-9a33-224b8abcab6a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"433060c6-c174-4a38-9a33-224b8abcab6a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.366923 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.392290 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a888c987-e97d-4f33-9932-158161870fe6-config-volume\") pod \"a888c987-e97d-4f33-9932-158161870fe6\" (UID: \"a888c987-e97d-4f33-9932-158161870fe6\") " Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.392419 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a888c987-e97d-4f33-9932-158161870fe6-secret-volume\") pod \"a888c987-e97d-4f33-9932-158161870fe6\" (UID: \"a888c987-e97d-4f33-9932-158161870fe6\") " Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.392480 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xk9h\" (UniqueName: 
\"kubernetes.io/projected/a888c987-e97d-4f33-9932-158161870fe6-kube-api-access-5xk9h\") pod \"a888c987-e97d-4f33-9932-158161870fe6\" (UID: \"a888c987-e97d-4f33-9932-158161870fe6\") " Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.394869 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a888c987-e97d-4f33-9932-158161870fe6-config-volume" (OuterVolumeSpecName: "config-volume") pod "a888c987-e97d-4f33-9932-158161870fe6" (UID: "a888c987-e97d-4f33-9932-158161870fe6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.400085 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a888c987-e97d-4f33-9932-158161870fe6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a888c987-e97d-4f33-9932-158161870fe6" (UID: "a888c987-e97d-4f33-9932-158161870fe6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.409886 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j4tjs" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.421494 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a888c987-e97d-4f33-9932-158161870fe6-kube-api-access-5xk9h" (OuterVolumeSpecName: "kube-api-access-5xk9h") pod "a888c987-e97d-4f33-9932-158161870fe6" (UID: "a888c987-e97d-4f33-9932-158161870fe6"). InnerVolumeSpecName "kube-api-access-5xk9h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.429172 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hgwp8"] Nov 25 09:06:54 crc kubenswrapper[4565]: E1125 09:06:54.429405 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a888c987-e97d-4f33-9932-158161870fe6" containerName="collect-profiles" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.429423 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a888c987-e97d-4f33-9932-158161870fe6" containerName="collect-profiles" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.429535 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="a888c987-e97d-4f33-9932-158161870fe6" containerName="collect-profiles" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.430575 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hgwp8" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.432919 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hgwp8"] Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.494939 4565 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a888c987-e97d-4f33-9932-158161870fe6-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.494977 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xk9h\" (UniqueName: \"kubernetes.io/projected/a888c987-e97d-4f33-9932-158161870fe6-kube-api-access-5xk9h\") on node \"crc\" DevicePath \"\"" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.494987 4565 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a888c987-e97d-4f33-9932-158161870fe6-config-volume\") on node \"crc\" DevicePath \"\"" 
Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.523646 4565 patch_prober.go:28] interesting pod/router-default-5444994796-h5ktx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 09:06:54 crc kubenswrapper[4565]: [-]has-synced failed: reason withheld Nov 25 09:06:54 crc kubenswrapper[4565]: [+]process-running ok Nov 25 09:06:54 crc kubenswrapper[4565]: healthz check failed Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.523723 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5ktx" podUID="66cd3426-ad3a-499d-8b09-056348e1413a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.530487 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f99279f137737c58952a966fcb098c8d77124ed824b78a78e416293ada61ecbd"} Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.530817 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d7cc701391b13115c31473c45d3cc7328c38713b0b701301b0ac246026896868"} Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.545003 4565 generic.go:334] "Generic (PLEG): container finished" podID="5846c394-bccc-4d13-8cea-d3deb171c550" containerID="f3c584d70f010838aa77c6345225588c0b05e6035d50cc76333603f9830cea0f" exitCode=0 Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.545077 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxkk7" 
event={"ID":"5846c394-bccc-4d13-8cea-d3deb171c550","Type":"ContainerDied","Data":"f3c584d70f010838aa77c6345225588c0b05e6035d50cc76333603f9830cea0f"} Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.545103 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxkk7" event={"ID":"5846c394-bccc-4d13-8cea-d3deb171c550","Type":"ContainerStarted","Data":"4604a7c85adbaa945b400d5cf6f7de985ee5af30c28f46d45f1f10c8926e9543"} Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.552637 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb","Type":"ContainerStarted","Data":"2d3ead8be3b0c3c7ee71b215acf6895cec3a59b53e3d79020f4145f15dd8188b"} Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.584702 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401020-4vwj8" event={"ID":"a888c987-e97d-4f33-9932-158161870fe6","Type":"ContainerDied","Data":"a194d5f1f16bf081a4b2ca436ff71e4b5474979d0e082b6fc5639848ce9d1900"} Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.584754 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a194d5f1f16bf081a4b2ca436ff71e4b5474979d0e082b6fc5639848ce9d1900" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.584828 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401020-4vwj8" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.591046 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d40f5ebeddd3bb79311a500e9ae5c341cd4c46f06e746193990bf01c78603ea1"} Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.591084 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"827c2ba68defbf0f2af8521971f6f42cf488bafe92f12080f19d595be1a5115d"} Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.591696 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.595539 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1-catalog-content\") pod \"redhat-operators-hgwp8\" (UID: \"c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1\") " pod="openshift-marketplace/redhat-operators-hgwp8" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.595592 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1-utilities\") pod \"redhat-operators-hgwp8\" (UID: \"c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1\") " pod="openshift-marketplace/redhat-operators-hgwp8" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.595685 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9bw7\" (UniqueName: 
\"kubernetes.io/projected/c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1-kube-api-access-v9bw7\") pod \"redhat-operators-hgwp8\" (UID: \"c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1\") " pod="openshift-marketplace/redhat-operators-hgwp8" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.596688 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcxpx" event={"ID":"6e1cda59-b195-4f26-9b05-156ac97681d5","Type":"ContainerStarted","Data":"7636c716a3f231ac6150060a6825707ea58b2bef784d3758902a293368db7eeb"} Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.609894 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.60988194 podStartE2EDuration="2.60988194s" podCreationTimestamp="2025-11-25 09:06:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:54.608753714 +0000 UTC m=+147.811248852" watchObservedRunningTime="2025-11-25 09:06:54.60988194 +0000 UTC m=+147.812377078" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.625477 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f39ffb2070c79ec010e425f858695cf8e543b20cf56facc404c51c037f4406eb"} Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.698051 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1-utilities\") pod \"redhat-operators-hgwp8\" (UID: \"c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1\") " pod="openshift-marketplace/redhat-operators-hgwp8" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.698710 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-v9bw7\" (UniqueName: \"kubernetes.io/projected/c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1-kube-api-access-v9bw7\") pod \"redhat-operators-hgwp8\" (UID: \"c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1\") " pod="openshift-marketplace/redhat-operators-hgwp8" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.698823 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1-catalog-content\") pod \"redhat-operators-hgwp8\" (UID: \"c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1\") " pod="openshift-marketplace/redhat-operators-hgwp8" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.699970 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1-utilities\") pod \"redhat-operators-hgwp8\" (UID: \"c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1\") " pod="openshift-marketplace/redhat-operators-hgwp8" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.704557 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1-catalog-content\") pod \"redhat-operators-hgwp8\" (UID: \"c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1\") " pod="openshift-marketplace/redhat-operators-hgwp8" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.729995 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9bw7\" (UniqueName: \"kubernetes.io/projected/c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1-kube-api-access-v9bw7\") pod \"redhat-operators-hgwp8\" (UID: \"c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1\") " pod="openshift-marketplace/redhat-operators-hgwp8" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.772964 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hgwp8" Nov 25 09:06:54 crc kubenswrapper[4565]: I1125 09:06:54.889399 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 25 09:06:54 crc kubenswrapper[4565]: W1125 09:06:54.914710 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod433060c6_c174_4a38_9a33_224b8abcab6a.slice/crio-dd417bd90ecd03c7722e800f208e56cec8ecad17bff23f18173c496432f34d85 WatchSource:0}: Error finding container dd417bd90ecd03c7722e800f208e56cec8ecad17bff23f18173c496432f34d85: Status 404 returned error can't find the container with id dd417bd90ecd03c7722e800f208e56cec8ecad17bff23f18173c496432f34d85 Nov 25 09:06:55 crc kubenswrapper[4565]: I1125 09:06:55.099873 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:06:55 crc kubenswrapper[4565]: I1125 09:06:55.100959 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:06:55 crc kubenswrapper[4565]: I1125 09:06:55.165147 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j4tjs"] Nov 25 09:06:55 crc kubenswrapper[4565]: W1125 09:06:55.179095 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4191dbab_fe34_482b_a895_adef8dc1c4a0.slice/crio-9e5b56fff6e524773fff6688c2556ca4b5e149229b582c4c8e37e68ac6cef825 
WatchSource:0}: Error finding container 9e5b56fff6e524773fff6688c2556ca4b5e149229b582c4c8e37e68ac6cef825: Status 404 returned error can't find the container with id 9e5b56fff6e524773fff6688c2556ca4b5e149229b582c4c8e37e68ac6cef825 Nov 25 09:06:55 crc kubenswrapper[4565]: I1125 09:06:55.383120 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hgwp8"] Nov 25 09:06:55 crc kubenswrapper[4565]: I1125 09:06:55.504871 4565 patch_prober.go:28] interesting pod/router-default-5444994796-h5ktx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 09:06:55 crc kubenswrapper[4565]: [-]has-synced failed: reason withheld Nov 25 09:06:55 crc kubenswrapper[4565]: [+]process-running ok Nov 25 09:06:55 crc kubenswrapper[4565]: healthz check failed Nov 25 09:06:55 crc kubenswrapper[4565]: I1125 09:06:55.504913 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5ktx" podUID="66cd3426-ad3a-499d-8b09-056348e1413a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 09:06:55 crc kubenswrapper[4565]: I1125 09:06:55.654200 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"433060c6-c174-4a38-9a33-224b8abcab6a","Type":"ContainerStarted","Data":"1f0bf98c650cb4c273bad70721fb3c9d6a9e4055a361bb9d1ec0e7abeebc7e0e"} Nov 25 09:06:55 crc kubenswrapper[4565]: I1125 09:06:55.654259 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"433060c6-c174-4a38-9a33-224b8abcab6a","Type":"ContainerStarted","Data":"dd417bd90ecd03c7722e800f208e56cec8ecad17bff23f18173c496432f34d85"} Nov 25 09:06:55 crc kubenswrapper[4565]: I1125 09:06:55.699185 4565 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"608746e02b25342757b30e56e6a660330d8acc314053ea49524eede48b30c8c7"} Nov 25 09:06:55 crc kubenswrapper[4565]: I1125 09:06:55.703783 4565 generic.go:334] "Generic (PLEG): container finished" podID="4191dbab-fe34-482b-a895-adef8dc1c4a0" containerID="db3e75830fe2816876f0ecca7c6317d5a4fef338d38e6e3636454dcf71e5f85f" exitCode=0 Nov 25 09:06:55 crc kubenswrapper[4565]: I1125 09:06:55.703868 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4tjs" event={"ID":"4191dbab-fe34-482b-a895-adef8dc1c4a0","Type":"ContainerDied","Data":"db3e75830fe2816876f0ecca7c6317d5a4fef338d38e6e3636454dcf71e5f85f"} Nov 25 09:06:55 crc kubenswrapper[4565]: I1125 09:06:55.703894 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4tjs" event={"ID":"4191dbab-fe34-482b-a895-adef8dc1c4a0","Type":"ContainerStarted","Data":"9e5b56fff6e524773fff6688c2556ca4b5e149229b582c4c8e37e68ac6cef825"} Nov 25 09:06:55 crc kubenswrapper[4565]: I1125 09:06:55.727102 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgwp8" event={"ID":"c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1","Type":"ContainerStarted","Data":"0cbd2f7adc0de8c052b1c68c993adfa6ccd8eed8af3e482dd420082d7884572c"} Nov 25 09:06:55 crc kubenswrapper[4565]: I1125 09:06:55.727207 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.726171415 podStartE2EDuration="2.726171415s" podCreationTimestamp="2025-11-25 09:06:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:06:55.674279263 +0000 UTC m=+148.876774401" watchObservedRunningTime="2025-11-25 
09:06:55.726171415 +0000 UTC m=+148.928666552" Nov 25 09:06:55 crc kubenswrapper[4565]: I1125 09:06:55.733686 4565 generic.go:334] "Generic (PLEG): container finished" podID="8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb" containerID="2d3ead8be3b0c3c7ee71b215acf6895cec3a59b53e3d79020f4145f15dd8188b" exitCode=0 Nov 25 09:06:55 crc kubenswrapper[4565]: I1125 09:06:55.733815 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb","Type":"ContainerDied","Data":"2d3ead8be3b0c3c7ee71b215acf6895cec3a59b53e3d79020f4145f15dd8188b"} Nov 25 09:06:55 crc kubenswrapper[4565]: I1125 09:06:55.738025 4565 generic.go:334] "Generic (PLEG): container finished" podID="6e1cda59-b195-4f26-9b05-156ac97681d5" containerID="0928bb4e735affa92e29ef183cd726e4fa2340634c641c66e3c5b8e34d147492" exitCode=0 Nov 25 09:06:55 crc kubenswrapper[4565]: I1125 09:06:55.738594 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcxpx" event={"ID":"6e1cda59-b195-4f26-9b05-156ac97681d5","Type":"ContainerDied","Data":"0928bb4e735affa92e29ef183cd726e4fa2340634c641c66e3c5b8e34d147492"} Nov 25 09:06:56 crc kubenswrapper[4565]: I1125 09:06:56.504311 4565 patch_prober.go:28] interesting pod/router-default-5444994796-h5ktx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 09:06:56 crc kubenswrapper[4565]: [-]has-synced failed: reason withheld Nov 25 09:06:56 crc kubenswrapper[4565]: [+]process-running ok Nov 25 09:06:56 crc kubenswrapper[4565]: healthz check failed Nov 25 09:06:56 crc kubenswrapper[4565]: I1125 09:06:56.504368 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5ktx" podUID="66cd3426-ad3a-499d-8b09-056348e1413a" containerName="router" probeResult="failure" output="HTTP probe 
failed with statuscode: 500" Nov 25 09:06:56 crc kubenswrapper[4565]: I1125 09:06:56.799130 4565 generic.go:334] "Generic (PLEG): container finished" podID="c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1" containerID="72caca8b264ddee000536dbc3a66fbef4955c6f4f13e492296527d5d6cecfbfe" exitCode=0 Nov 25 09:06:56 crc kubenswrapper[4565]: I1125 09:06:56.799206 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgwp8" event={"ID":"c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1","Type":"ContainerDied","Data":"72caca8b264ddee000536dbc3a66fbef4955c6f4f13e492296527d5d6cecfbfe"} Nov 25 09:06:56 crc kubenswrapper[4565]: I1125 09:06:56.808460 4565 generic.go:334] "Generic (PLEG): container finished" podID="433060c6-c174-4a38-9a33-224b8abcab6a" containerID="1f0bf98c650cb4c273bad70721fb3c9d6a9e4055a361bb9d1ec0e7abeebc7e0e" exitCode=0 Nov 25 09:06:56 crc kubenswrapper[4565]: I1125 09:06:56.808658 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"433060c6-c174-4a38-9a33-224b8abcab6a","Type":"ContainerDied","Data":"1f0bf98c650cb4c273bad70721fb3c9d6a9e4055a361bb9d1ec0e7abeebc7e0e"} Nov 25 09:06:57 crc kubenswrapper[4565]: I1125 09:06:57.416118 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 09:06:57 crc kubenswrapper[4565]: I1125 09:06:57.509287 4565 patch_prober.go:28] interesting pod/router-default-5444994796-h5ktx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 09:06:57 crc kubenswrapper[4565]: [-]has-synced failed: reason withheld Nov 25 09:06:57 crc kubenswrapper[4565]: [+]process-running ok Nov 25 09:06:57 crc kubenswrapper[4565]: healthz check failed Nov 25 09:06:57 crc kubenswrapper[4565]: I1125 09:06:57.509334 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5ktx" podUID="66cd3426-ad3a-499d-8b09-056348e1413a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 09:06:57 crc kubenswrapper[4565]: I1125 09:06:57.552460 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb-kubelet-dir\") pod \"8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb\" (UID: \"8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb\") " Nov 25 09:06:57 crc kubenswrapper[4565]: I1125 09:06:57.552566 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb-kube-api-access\") pod \"8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb\" (UID: \"8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb\") " Nov 25 09:06:57 crc kubenswrapper[4565]: I1125 09:06:57.561757 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb" (UID: "8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:06:57 crc kubenswrapper[4565]: I1125 09:06:57.562324 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb" (UID: "8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:06:57 crc kubenswrapper[4565]: I1125 09:06:57.654572 4565 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 25 09:06:57 crc kubenswrapper[4565]: I1125 09:06:57.654608 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 09:06:57 crc kubenswrapper[4565]: I1125 09:06:57.822088 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 09:06:57 crc kubenswrapper[4565]: I1125 09:06:57.822245 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb","Type":"ContainerDied","Data":"7ab1d1818b8ae85a0f618f5403296908ecbc181d48948c42044d0ee915edd5af"} Nov 25 09:06:57 crc kubenswrapper[4565]: I1125 09:06:57.822283 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ab1d1818b8ae85a0f618f5403296908ecbc181d48948c42044d0ee915edd5af" Nov 25 09:06:58 crc kubenswrapper[4565]: I1125 09:06:58.125953 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 09:06:58 crc kubenswrapper[4565]: I1125 09:06:58.289063 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/433060c6-c174-4a38-9a33-224b8abcab6a-kube-api-access\") pod \"433060c6-c174-4a38-9a33-224b8abcab6a\" (UID: \"433060c6-c174-4a38-9a33-224b8abcab6a\") " Nov 25 09:06:58 crc kubenswrapper[4565]: I1125 09:06:58.289169 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/433060c6-c174-4a38-9a33-224b8abcab6a-kubelet-dir\") pod \"433060c6-c174-4a38-9a33-224b8abcab6a\" (UID: \"433060c6-c174-4a38-9a33-224b8abcab6a\") " Nov 25 09:06:58 crc kubenswrapper[4565]: I1125 09:06:58.289452 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/433060c6-c174-4a38-9a33-224b8abcab6a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "433060c6-c174-4a38-9a33-224b8abcab6a" (UID: "433060c6-c174-4a38-9a33-224b8abcab6a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:06:58 crc kubenswrapper[4565]: I1125 09:06:58.295490 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/433060c6-c174-4a38-9a33-224b8abcab6a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "433060c6-c174-4a38-9a33-224b8abcab6a" (UID: "433060c6-c174-4a38-9a33-224b8abcab6a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:06:58 crc kubenswrapper[4565]: I1125 09:06:58.390653 4565 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/433060c6-c174-4a38-9a33-224b8abcab6a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 25 09:06:58 crc kubenswrapper[4565]: I1125 09:06:58.390679 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/433060c6-c174-4a38-9a33-224b8abcab6a-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 09:06:58 crc kubenswrapper[4565]: I1125 09:06:58.504584 4565 patch_prober.go:28] interesting pod/router-default-5444994796-h5ktx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 09:06:58 crc kubenswrapper[4565]: [-]has-synced failed: reason withheld Nov 25 09:06:58 crc kubenswrapper[4565]: [+]process-running ok Nov 25 09:06:58 crc kubenswrapper[4565]: healthz check failed Nov 25 09:06:58 crc kubenswrapper[4565]: I1125 09:06:58.504630 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5ktx" podUID="66cd3426-ad3a-499d-8b09-056348e1413a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 09:06:58 crc kubenswrapper[4565]: I1125 09:06:58.666468 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-bdn5h" Nov 25 09:06:58 crc kubenswrapper[4565]: I1125 09:06:58.835977 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"433060c6-c174-4a38-9a33-224b8abcab6a","Type":"ContainerDied","Data":"dd417bd90ecd03c7722e800f208e56cec8ecad17bff23f18173c496432f34d85"} Nov 25 09:06:58 crc kubenswrapper[4565]: I1125 09:06:58.836012 4565 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd417bd90ecd03c7722e800f208e56cec8ecad17bff23f18173c496432f34d85" Nov 25 09:06:58 crc kubenswrapper[4565]: I1125 09:06:58.836018 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 09:06:59 crc kubenswrapper[4565]: I1125 09:06:59.504954 4565 patch_prober.go:28] interesting pod/router-default-5444994796-h5ktx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 09:06:59 crc kubenswrapper[4565]: [-]has-synced failed: reason withheld Nov 25 09:06:59 crc kubenswrapper[4565]: [+]process-running ok Nov 25 09:06:59 crc kubenswrapper[4565]: healthz check failed Nov 25 09:06:59 crc kubenswrapper[4565]: I1125 09:06:59.505208 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5ktx" podUID="66cd3426-ad3a-499d-8b09-056348e1413a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 09:07:00 crc kubenswrapper[4565]: I1125 09:07:00.505154 4565 patch_prober.go:28] interesting pod/router-default-5444994796-h5ktx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 09:07:00 crc kubenswrapper[4565]: [-]has-synced failed: reason withheld Nov 25 09:07:00 crc kubenswrapper[4565]: [+]process-running ok Nov 25 09:07:00 crc kubenswrapper[4565]: healthz check failed Nov 25 09:07:00 crc kubenswrapper[4565]: I1125 09:07:00.505218 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5ktx" podUID="66cd3426-ad3a-499d-8b09-056348e1413a" containerName="router" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Nov 25 09:07:01 crc kubenswrapper[4565]: I1125 09:07:01.506542 4565 patch_prober.go:28] interesting pod/router-default-5444994796-h5ktx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 09:07:01 crc kubenswrapper[4565]: [-]has-synced failed: reason withheld Nov 25 09:07:01 crc kubenswrapper[4565]: [+]process-running ok Nov 25 09:07:01 crc kubenswrapper[4565]: healthz check failed Nov 25 09:07:01 crc kubenswrapper[4565]: I1125 09:07:01.506609 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5ktx" podUID="66cd3426-ad3a-499d-8b09-056348e1413a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 09:07:02 crc kubenswrapper[4565]: I1125 09:07:02.504823 4565 patch_prober.go:28] interesting pod/router-default-5444994796-h5ktx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 09:07:02 crc kubenswrapper[4565]: [-]has-synced failed: reason withheld Nov 25 09:07:02 crc kubenswrapper[4565]: [+]process-running ok Nov 25 09:07:02 crc kubenswrapper[4565]: healthz check failed Nov 25 09:07:02 crc kubenswrapper[4565]: I1125 09:07:02.505030 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5ktx" podUID="66cd3426-ad3a-499d-8b09-056348e1413a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 09:07:02 crc kubenswrapper[4565]: I1125 09:07:02.584123 4565 patch_prober.go:28] interesting pod/console-f9d7485db-wvc8w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" 
start-of-body= Nov 25 09:07:02 crc kubenswrapper[4565]: I1125 09:07:02.584171 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-wvc8w" podUID="418a0125-b167-49b8-b6bd-0c97a587107c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" Nov 25 09:07:03 crc kubenswrapper[4565]: I1125 09:07:03.297582 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-wbp5x" Nov 25 09:07:03 crc kubenswrapper[4565]: I1125 09:07:03.505303 4565 patch_prober.go:28] interesting pod/router-default-5444994796-h5ktx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 09:07:03 crc kubenswrapper[4565]: [-]has-synced failed: reason withheld Nov 25 09:07:03 crc kubenswrapper[4565]: [+]process-running ok Nov 25 09:07:03 crc kubenswrapper[4565]: healthz check failed Nov 25 09:07:03 crc kubenswrapper[4565]: I1125 09:07:03.505356 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5ktx" podUID="66cd3426-ad3a-499d-8b09-056348e1413a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 09:07:04 crc kubenswrapper[4565]: I1125 09:07:04.505536 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-h5ktx" Nov 25 09:07:04 crc kubenswrapper[4565]: I1125 09:07:04.508308 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-h5ktx" Nov 25 09:07:06 crc kubenswrapper[4565]: I1125 09:07:06.926690 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs\") pod \"network-metrics-daemon-fzpzk\" (UID: \"b5b047b2-31c7-45e7-a944-8d9c6de61061\") " pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:07:06 crc kubenswrapper[4565]: I1125 09:07:06.931855 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5b047b2-31c7-45e7-a944-8d9c6de61061-metrics-certs\") pod \"network-metrics-daemon-fzpzk\" (UID: \"b5b047b2-31c7-45e7-a944-8d9c6de61061\") " pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:07:07 crc kubenswrapper[4565]: I1125 09:07:07.012478 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fzpzk" Nov 25 09:07:07 crc kubenswrapper[4565]: I1125 09:07:07.459769 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fzpzk"] Nov 25 09:07:07 crc kubenswrapper[4565]: W1125 09:07:07.467308 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5b047b2_31c7_45e7_a944_8d9c6de61061.slice/crio-603d742dea2ce67d512775da3803d9b1236ed611908c44e4a5fb02d02030be70 WatchSource:0}: Error finding container 603d742dea2ce67d512775da3803d9b1236ed611908c44e4a5fb02d02030be70: Status 404 returned error can't find the container with id 603d742dea2ce67d512775da3803d9b1236ed611908c44e4a5fb02d02030be70 Nov 25 09:07:07 crc kubenswrapper[4565]: I1125 09:07:07.935315 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fzpzk" event={"ID":"b5b047b2-31c7-45e7-a944-8d9c6de61061","Type":"ContainerStarted","Data":"71cd94d439be5a39b9f9657030e01737ac1e55e4a3a64434d1f7dd6b15227664"} Nov 25 09:07:07 crc kubenswrapper[4565]: I1125 09:07:07.935629 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fzpzk" 
event={"ID":"b5b047b2-31c7-45e7-a944-8d9c6de61061","Type":"ContainerStarted","Data":"603d742dea2ce67d512775da3803d9b1236ed611908c44e4a5fb02d02030be70"} Nov 25 09:07:08 crc kubenswrapper[4565]: I1125 09:07:08.941807 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fzpzk" event={"ID":"b5b047b2-31c7-45e7-a944-8d9c6de61061","Type":"ContainerStarted","Data":"c24765f525854c81510f3bfef6901134a5952775101cd3d1a81a47da7448ce83"} Nov 25 09:07:12 crc kubenswrapper[4565]: I1125 09:07:12.587921 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:07:12 crc kubenswrapper[4565]: I1125 09:07:12.593325 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:07:12 crc kubenswrapper[4565]: I1125 09:07:12.604603 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-fzpzk" podStartSLOduration=147.604582166 podStartE2EDuration="2m27.604582166s" podCreationTimestamp="2025-11-25 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:07:08.953477604 +0000 UTC m=+162.155972742" watchObservedRunningTime="2025-11-25 09:07:12.604582166 +0000 UTC m=+165.807077303" Nov 25 09:07:12 crc kubenswrapper[4565]: I1125 09:07:12.680189 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:07:18 crc kubenswrapper[4565]: E1125 09:07:18.356525 4565 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Nov 25 09:07:18 crc kubenswrapper[4565]: E1125 09:07:18.358145 4565 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6f7m2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-hcxpx_openshift-marketplace(6e1cda59-b195-4f26-9b05-156ac97681d5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 25 09:07:18 crc kubenswrapper[4565]: E1125 09:07:18.359431 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-hcxpx" podUID="6e1cda59-b195-4f26-9b05-156ac97681d5" Nov 25 09:07:18 crc kubenswrapper[4565]: I1125 09:07:18.986784 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgwp8" event={"ID":"c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1","Type":"ContainerStarted","Data":"1f58c324e31e333b8e99b8e088313348c6f8eab397b43f0684b1be5f2a48a5e3"} Nov 25 09:07:18 crc kubenswrapper[4565]: I1125 09:07:18.991330 4565 generic.go:334] "Generic (PLEG): container finished" podID="9d44c664-75e0-4020-a8f6-8bd5ac874798" containerID="c4189774383d9536bb065a03b6b6238bfaab863b56938609dc4e9f8ddb12b925" exitCode=0 Nov 25 09:07:18 crc kubenswrapper[4565]: I1125 09:07:18.991775 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzjfq" event={"ID":"9d44c664-75e0-4020-a8f6-8bd5ac874798","Type":"ContainerDied","Data":"c4189774383d9536bb065a03b6b6238bfaab863b56938609dc4e9f8ddb12b925"} Nov 25 09:07:19 crc kubenswrapper[4565]: I1125 09:07:18.999502 4565 generic.go:334] "Generic (PLEG): container finished" podID="50c07298-4434-42c5-ab63-fe196405db28" containerID="912e1a091471d055b9205c8a1cb4581d3a131b58ab0c7a1a169ec2a68469b205" exitCode=0 Nov 25 09:07:19 crc kubenswrapper[4565]: I1125 09:07:18.999575 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pwl47" event={"ID":"50c07298-4434-42c5-ab63-fe196405db28","Type":"ContainerDied","Data":"912e1a091471d055b9205c8a1cb4581d3a131b58ab0c7a1a169ec2a68469b205"} Nov 25 09:07:19 crc kubenswrapper[4565]: I1125 09:07:19.001389 4565 generic.go:334] "Generic (PLEG): container finished" podID="3f41ac7d-f98d-4d50-8346-fc730f41309c" containerID="2ae5d645bcd23d2f15ce3f0ce98c779b6f5b8f6f72e32884d602e2a8bb503b56" 
exitCode=0 Nov 25 09:07:19 crc kubenswrapper[4565]: I1125 09:07:19.001423 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9bxs" event={"ID":"3f41ac7d-f98d-4d50-8346-fc730f41309c","Type":"ContainerDied","Data":"2ae5d645bcd23d2f15ce3f0ce98c779b6f5b8f6f72e32884d602e2a8bb503b56"} Nov 25 09:07:19 crc kubenswrapper[4565]: I1125 09:07:19.006627 4565 generic.go:334] "Generic (PLEG): container finished" podID="63ed490a-3e8d-485a-80d9-8c482c9172a1" containerID="58ef678de9d92e1d735880bdfe5bfb1de8222126c364cabe07f849b9285793a2" exitCode=0 Nov 25 09:07:19 crc kubenswrapper[4565]: I1125 09:07:19.006674 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d66wt" event={"ID":"63ed490a-3e8d-485a-80d9-8c482c9172a1","Type":"ContainerDied","Data":"58ef678de9d92e1d735880bdfe5bfb1de8222126c364cabe07f849b9285793a2"} Nov 25 09:07:19 crc kubenswrapper[4565]: I1125 09:07:19.013130 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4tjs" event={"ID":"4191dbab-fe34-482b-a895-adef8dc1c4a0","Type":"ContainerStarted","Data":"3baf24e5a133ae546c1e7ea3045c27fe13aee43be2fbff81487968a82e433ae4"} Nov 25 09:07:19 crc kubenswrapper[4565]: I1125 09:07:19.016878 4565 generic.go:334] "Generic (PLEG): container finished" podID="5846c394-bccc-4d13-8cea-d3deb171c550" containerID="acd4e0b42bd342f7f2d116a666181d898211a4cce9a5361c1d4de7ec93845048" exitCode=0 Nov 25 09:07:19 crc kubenswrapper[4565]: I1125 09:07:19.017853 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxkk7" event={"ID":"5846c394-bccc-4d13-8cea-d3deb171c550","Type":"ContainerDied","Data":"acd4e0b42bd342f7f2d116a666181d898211a4cce9a5361c1d4de7ec93845048"} Nov 25 09:07:19 crc kubenswrapper[4565]: E1125 09:07:19.019354 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with 
ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-hcxpx" podUID="6e1cda59-b195-4f26-9b05-156ac97681d5" Nov 25 09:07:20 crc kubenswrapper[4565]: I1125 09:07:20.029819 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d66wt" event={"ID":"63ed490a-3e8d-485a-80d9-8c482c9172a1","Type":"ContainerStarted","Data":"e8fb1d87af8406f1d219900083e170e6928e0463c0a113a28a2323781bb18b06"} Nov 25 09:07:20 crc kubenswrapper[4565]: I1125 09:07:20.031713 4565 generic.go:334] "Generic (PLEG): container finished" podID="4191dbab-fe34-482b-a895-adef8dc1c4a0" containerID="3baf24e5a133ae546c1e7ea3045c27fe13aee43be2fbff81487968a82e433ae4" exitCode=0 Nov 25 09:07:20 crc kubenswrapper[4565]: I1125 09:07:20.031786 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4tjs" event={"ID":"4191dbab-fe34-482b-a895-adef8dc1c4a0","Type":"ContainerDied","Data":"3baf24e5a133ae546c1e7ea3045c27fe13aee43be2fbff81487968a82e433ae4"} Nov 25 09:07:20 crc kubenswrapper[4565]: I1125 09:07:20.033737 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxkk7" event={"ID":"5846c394-bccc-4d13-8cea-d3deb171c550","Type":"ContainerStarted","Data":"ab50a96570a551cca614783ff281bc14931bc7482fcf96177803e3beb713b998"} Nov 25 09:07:20 crc kubenswrapper[4565]: I1125 09:07:20.035876 4565 generic.go:334] "Generic (PLEG): container finished" podID="c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1" containerID="1f58c324e31e333b8e99b8e088313348c6f8eab397b43f0684b1be5f2a48a5e3" exitCode=0 Nov 25 09:07:20 crc kubenswrapper[4565]: I1125 09:07:20.035967 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgwp8" 
event={"ID":"c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1","Type":"ContainerDied","Data":"1f58c324e31e333b8e99b8e088313348c6f8eab397b43f0684b1be5f2a48a5e3"} Nov 25 09:07:20 crc kubenswrapper[4565]: I1125 09:07:20.040675 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzjfq" event={"ID":"9d44c664-75e0-4020-a8f6-8bd5ac874798","Type":"ContainerStarted","Data":"707a09ae997c9b1b89378ef76c1fe22645b61d8dff64c1de3a7e6d9d2095a757"} Nov 25 09:07:20 crc kubenswrapper[4565]: I1125 09:07:20.044420 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pwl47" event={"ID":"50c07298-4434-42c5-ab63-fe196405db28","Type":"ContainerStarted","Data":"b2b4e732d839e1803721e4d3c1e3f549e3f7ade4173a607b45da1196dc0bb799"} Nov 25 09:07:20 crc kubenswrapper[4565]: I1125 09:07:20.048505 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9bxs" event={"ID":"3f41ac7d-f98d-4d50-8346-fc730f41309c","Type":"ContainerStarted","Data":"f0d7c9ebfa8ffcd3860431f79b69809bd2fa44a97bb60cc5a4012f2ff7c9a0af"} Nov 25 09:07:20 crc kubenswrapper[4565]: I1125 09:07:20.055624 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d66wt" podStartSLOduration=2.997020818 podStartE2EDuration="30.055613638s" podCreationTimestamp="2025-11-25 09:06:50 +0000 UTC" firstStartedPulling="2025-11-25 09:06:52.448578688 +0000 UTC m=+145.651073827" lastFinishedPulling="2025-11-25 09:07:19.507171509 +0000 UTC m=+172.709666647" observedRunningTime="2025-11-25 09:07:20.053079161 +0000 UTC m=+173.255574300" watchObservedRunningTime="2025-11-25 09:07:20.055613638 +0000 UTC m=+173.258108776" Nov 25 09:07:20 crc kubenswrapper[4565]: I1125 09:07:20.090212 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hxkk7" podStartSLOduration=3.044113681 
podStartE2EDuration="28.090183712s" podCreationTimestamp="2025-11-25 09:06:52 +0000 UTC" firstStartedPulling="2025-11-25 09:06:54.550157681 +0000 UTC m=+147.752652819" lastFinishedPulling="2025-11-25 09:07:19.596227712 +0000 UTC m=+172.798722850" observedRunningTime="2025-11-25 09:07:20.088945098 +0000 UTC m=+173.291440236" watchObservedRunningTime="2025-11-25 09:07:20.090183712 +0000 UTC m=+173.292678850" Nov 25 09:07:20 crc kubenswrapper[4565]: I1125 09:07:20.105940 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s9bxs" podStartSLOduration=3.078992075 podStartE2EDuration="30.105896099s" podCreationTimestamp="2025-11-25 09:06:50 +0000 UTC" firstStartedPulling="2025-11-25 09:06:52.453448247 +0000 UTC m=+145.655943374" lastFinishedPulling="2025-11-25 09:07:19.48035226 +0000 UTC m=+172.682847398" observedRunningTime="2025-11-25 09:07:20.101669778 +0000 UTC m=+173.304164916" watchObservedRunningTime="2025-11-25 09:07:20.105896099 +0000 UTC m=+173.308391237" Nov 25 09:07:20 crc kubenswrapper[4565]: I1125 09:07:20.125987 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qzjfq" podStartSLOduration=2.108375495 podStartE2EDuration="29.12597067s" podCreationTimestamp="2025-11-25 09:06:51 +0000 UTC" firstStartedPulling="2025-11-25 09:06:52.455658634 +0000 UTC m=+145.658153772" lastFinishedPulling="2025-11-25 09:07:19.473253809 +0000 UTC m=+172.675748947" observedRunningTime="2025-11-25 09:07:20.12471844 +0000 UTC m=+173.327213578" watchObservedRunningTime="2025-11-25 09:07:20.12597067 +0000 UTC m=+173.328465809" Nov 25 09:07:20 crc kubenswrapper[4565]: I1125 09:07:20.140961 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pwl47" podStartSLOduration=2.09516708 podStartE2EDuration="29.140951835s" podCreationTimestamp="2025-11-25 09:06:51 +0000 UTC" 
firstStartedPulling="2025-11-25 09:06:52.448070124 +0000 UTC m=+145.650565262" lastFinishedPulling="2025-11-25 09:07:19.493854879 +0000 UTC m=+172.696350017" observedRunningTime="2025-11-25 09:07:20.139094258 +0000 UTC m=+173.341589397" watchObservedRunningTime="2025-11-25 09:07:20.140951835 +0000 UTC m=+173.343446972" Nov 25 09:07:21 crc kubenswrapper[4565]: I1125 09:07:21.058538 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4tjs" event={"ID":"4191dbab-fe34-482b-a895-adef8dc1c4a0","Type":"ContainerStarted","Data":"ccd7bdcf29bddffec4f4c1f7aa5191af3fed4976fb44b358b4eb6fc617ce7a09"} Nov 25 09:07:21 crc kubenswrapper[4565]: I1125 09:07:21.060473 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgwp8" event={"ID":"c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1","Type":"ContainerStarted","Data":"cc92db78dd94eb88663533cbb1f1a2f52a3fc4ad9f5ee8b3d663d68af1733b72"} Nov 25 09:07:21 crc kubenswrapper[4565]: I1125 09:07:21.085271 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j4tjs" podStartSLOduration=3.318522676 podStartE2EDuration="28.085247435s" podCreationTimestamp="2025-11-25 09:06:53 +0000 UTC" firstStartedPulling="2025-11-25 09:06:55.707126144 +0000 UTC m=+148.909621283" lastFinishedPulling="2025-11-25 09:07:20.473850904 +0000 UTC m=+173.676346042" observedRunningTime="2025-11-25 09:07:21.084569422 +0000 UTC m=+174.287064560" watchObservedRunningTime="2025-11-25 09:07:21.085247435 +0000 UTC m=+174.287742573" Nov 25 09:07:21 crc kubenswrapper[4565]: I1125 09:07:21.106765 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s9bxs" Nov 25 09:07:21 crc kubenswrapper[4565]: I1125 09:07:21.106813 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s9bxs" Nov 25 09:07:21 crc 
kubenswrapper[4565]: I1125 09:07:21.302254 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d66wt" Nov 25 09:07:21 crc kubenswrapper[4565]: I1125 09:07:21.302389 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d66wt" Nov 25 09:07:21 crc kubenswrapper[4565]: I1125 09:07:21.569633 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pwl47" Nov 25 09:07:21 crc kubenswrapper[4565]: I1125 09:07:21.569832 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pwl47" Nov 25 09:07:21 crc kubenswrapper[4565]: I1125 09:07:21.721630 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qzjfq" Nov 25 09:07:21 crc kubenswrapper[4565]: I1125 09:07:21.721903 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qzjfq" Nov 25 09:07:22 crc kubenswrapper[4565]: I1125 09:07:22.262137 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-s9bxs" podUID="3f41ac7d-f98d-4d50-8346-fc730f41309c" containerName="registry-server" probeResult="failure" output=< Nov 25 09:07:22 crc kubenswrapper[4565]: timeout: failed to connect service ":50051" within 1s Nov 25 09:07:22 crc kubenswrapper[4565]: > Nov 25 09:07:22 crc kubenswrapper[4565]: I1125 09:07:22.329202 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-d66wt" podUID="63ed490a-3e8d-485a-80d9-8c482c9172a1" containerName="registry-server" probeResult="failure" output=< Nov 25 09:07:22 crc kubenswrapper[4565]: timeout: failed to connect service ":50051" within 1s Nov 25 09:07:22 crc kubenswrapper[4565]: > Nov 25 09:07:22 crc kubenswrapper[4565]: 
I1125 09:07:22.606882 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-pwl47" podUID="50c07298-4434-42c5-ab63-fe196405db28" containerName="registry-server" probeResult="failure" output=< Nov 25 09:07:22 crc kubenswrapper[4565]: timeout: failed to connect service ":50051" within 1s Nov 25 09:07:22 crc kubenswrapper[4565]: > Nov 25 09:07:22 crc kubenswrapper[4565]: I1125 09:07:22.752096 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-qzjfq" podUID="9d44c664-75e0-4020-a8f6-8bd5ac874798" containerName="registry-server" probeResult="failure" output=< Nov 25 09:07:22 crc kubenswrapper[4565]: timeout: failed to connect service ":50051" within 1s Nov 25 09:07:22 crc kubenswrapper[4565]: > Nov 25 09:07:23 crc kubenswrapper[4565]: I1125 09:07:23.117124 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hxkk7" Nov 25 09:07:23 crc kubenswrapper[4565]: I1125 09:07:23.117163 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hxkk7" Nov 25 09:07:23 crc kubenswrapper[4565]: I1125 09:07:23.150695 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hxkk7" Nov 25 09:07:23 crc kubenswrapper[4565]: I1125 09:07:23.164618 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hgwp8" podStartSLOduration=5.428711632 podStartE2EDuration="29.164603986s" podCreationTimestamp="2025-11-25 09:06:54 +0000 UTC" firstStartedPulling="2025-11-25 09:06:56.800552062 +0000 UTC m=+150.003047201" lastFinishedPulling="2025-11-25 09:07:20.536444418 +0000 UTC m=+173.738939555" observedRunningTime="2025-11-25 09:07:21.108641496 +0000 UTC m=+174.311136634" watchObservedRunningTime="2025-11-25 09:07:23.164603986 +0000 UTC m=+176.367099124" Nov 25 
09:07:23 crc kubenswrapper[4565]: I1125 09:07:23.594467 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5c4hq" Nov 25 09:07:24 crc kubenswrapper[4565]: I1125 09:07:24.118157 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hxkk7" Nov 25 09:07:24 crc kubenswrapper[4565]: I1125 09:07:24.410449 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j4tjs" Nov 25 09:07:24 crc kubenswrapper[4565]: I1125 09:07:24.410490 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j4tjs" Nov 25 09:07:24 crc kubenswrapper[4565]: I1125 09:07:24.774492 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hgwp8" Nov 25 09:07:24 crc kubenswrapper[4565]: I1125 09:07:24.775033 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hgwp8" Nov 25 09:07:25 crc kubenswrapper[4565]: I1125 09:07:25.099075 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:07:25 crc kubenswrapper[4565]: I1125 09:07:25.099137 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:07:25 crc kubenswrapper[4565]: I1125 09:07:25.442121 4565 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-j4tjs" podUID="4191dbab-fe34-482b-a895-adef8dc1c4a0" containerName="registry-server" probeResult="failure" output=< Nov 25 09:07:25 crc kubenswrapper[4565]: timeout: failed to connect service ":50051" within 1s Nov 25 09:07:25 crc kubenswrapper[4565]: > Nov 25 09:07:25 crc kubenswrapper[4565]: I1125 09:07:25.807366 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hgwp8" podUID="c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1" containerName="registry-server" probeResult="failure" output=< Nov 25 09:07:25 crc kubenswrapper[4565]: timeout: failed to connect service ":50051" within 1s Nov 25 09:07:25 crc kubenswrapper[4565]: > Nov 25 09:07:31 crc kubenswrapper[4565]: I1125 09:07:31.134323 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s9bxs" Nov 25 09:07:31 crc kubenswrapper[4565]: I1125 09:07:31.163802 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s9bxs" Nov 25 09:07:31 crc kubenswrapper[4565]: I1125 09:07:31.328527 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d66wt" Nov 25 09:07:31 crc kubenswrapper[4565]: I1125 09:07:31.352132 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d66wt" Nov 25 09:07:31 crc kubenswrapper[4565]: I1125 09:07:31.593706 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pwl47" Nov 25 09:07:31 crc kubenswrapper[4565]: I1125 09:07:31.619006 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pwl47" Nov 25 09:07:31 crc kubenswrapper[4565]: I1125 09:07:31.758750 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/community-operators-qzjfq" Nov 25 09:07:31 crc kubenswrapper[4565]: I1125 09:07:31.782124 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nzk24"] Nov 25 09:07:31 crc kubenswrapper[4565]: I1125 09:07:31.792240 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qzjfq" Nov 25 09:07:32 crc kubenswrapper[4565]: I1125 09:07:32.102569 4565 generic.go:334] "Generic (PLEG): container finished" podID="6e1cda59-b195-4f26-9b05-156ac97681d5" containerID="dce7c5509d2bb2e0bebfbac6d900344452992909fe922eac4eea99fa8769d45b" exitCode=0 Nov 25 09:07:32 crc kubenswrapper[4565]: I1125 09:07:32.102705 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcxpx" event={"ID":"6e1cda59-b195-4f26-9b05-156ac97681d5","Type":"ContainerDied","Data":"dce7c5509d2bb2e0bebfbac6d900344452992909fe922eac4eea99fa8769d45b"} Nov 25 09:07:33 crc kubenswrapper[4565]: I1125 09:07:33.108074 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcxpx" event={"ID":"6e1cda59-b195-4f26-9b05-156ac97681d5","Type":"ContainerStarted","Data":"119d5bc3b6f8ada5bb9a76851f31fddb7e089403e19425bb518aa7ef8d235feb"} Nov 25 09:07:33 crc kubenswrapper[4565]: I1125 09:07:33.119031 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pwl47"] Nov 25 09:07:33 crc kubenswrapper[4565]: I1125 09:07:33.119169 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pwl47" podUID="50c07298-4434-42c5-ab63-fe196405db28" containerName="registry-server" containerID="cri-o://b2b4e732d839e1803721e4d3c1e3f549e3f7ade4173a607b45da1196dc0bb799" gracePeriod=2 Nov 25 09:07:33 crc kubenswrapper[4565]: I1125 09:07:33.123743 4565 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hcxpx" podStartSLOduration=3.323202207 podStartE2EDuration="40.123730751s" podCreationTimestamp="2025-11-25 09:06:53 +0000 UTC" firstStartedPulling="2025-11-25 09:06:55.745845866 +0000 UTC m=+148.948341004" lastFinishedPulling="2025-11-25 09:07:32.54637441 +0000 UTC m=+185.748869548" observedRunningTime="2025-11-25 09:07:33.121212936 +0000 UTC m=+186.323708073" watchObservedRunningTime="2025-11-25 09:07:33.123730751 +0000 UTC m=+186.326225889" Nov 25 09:07:33 crc kubenswrapper[4565]: I1125 09:07:33.228705 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 09:07:33 crc kubenswrapper[4565]: I1125 09:07:33.485071 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pwl47" Nov 25 09:07:33 crc kubenswrapper[4565]: I1125 09:07:33.517567 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hcxpx" Nov 25 09:07:33 crc kubenswrapper[4565]: I1125 09:07:33.517607 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hcxpx" Nov 25 09:07:33 crc kubenswrapper[4565]: I1125 09:07:33.544800 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hcxpx" Nov 25 09:07:33 crc kubenswrapper[4565]: I1125 09:07:33.663552 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50c07298-4434-42c5-ab63-fe196405db28-utilities\") pod \"50c07298-4434-42c5-ab63-fe196405db28\" (UID: \"50c07298-4434-42c5-ab63-fe196405db28\") " Nov 25 09:07:33 crc kubenswrapper[4565]: I1125 09:07:33.663609 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-jzsgw\" (UniqueName: \"kubernetes.io/projected/50c07298-4434-42c5-ab63-fe196405db28-kube-api-access-jzsgw\") pod \"50c07298-4434-42c5-ab63-fe196405db28\" (UID: \"50c07298-4434-42c5-ab63-fe196405db28\") " Nov 25 09:07:33 crc kubenswrapper[4565]: I1125 09:07:33.663662 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50c07298-4434-42c5-ab63-fe196405db28-catalog-content\") pod \"50c07298-4434-42c5-ab63-fe196405db28\" (UID: \"50c07298-4434-42c5-ab63-fe196405db28\") " Nov 25 09:07:33 crc kubenswrapper[4565]: I1125 09:07:33.664322 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50c07298-4434-42c5-ab63-fe196405db28-utilities" (OuterVolumeSpecName: "utilities") pod "50c07298-4434-42c5-ab63-fe196405db28" (UID: "50c07298-4434-42c5-ab63-fe196405db28"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:07:33 crc kubenswrapper[4565]: I1125 09:07:33.668623 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50c07298-4434-42c5-ab63-fe196405db28-kube-api-access-jzsgw" (OuterVolumeSpecName: "kube-api-access-jzsgw") pod "50c07298-4434-42c5-ab63-fe196405db28" (UID: "50c07298-4434-42c5-ab63-fe196405db28"). InnerVolumeSpecName "kube-api-access-jzsgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:07:33 crc kubenswrapper[4565]: I1125 09:07:33.696600 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50c07298-4434-42c5-ab63-fe196405db28-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50c07298-4434-42c5-ab63-fe196405db28" (UID: "50c07298-4434-42c5-ab63-fe196405db28"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:07:33 crc kubenswrapper[4565]: I1125 09:07:33.720264 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qzjfq"] Nov 25 09:07:33 crc kubenswrapper[4565]: I1125 09:07:33.720468 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qzjfq" podUID="9d44c664-75e0-4020-a8f6-8bd5ac874798" containerName="registry-server" containerID="cri-o://707a09ae997c9b1b89378ef76c1fe22645b61d8dff64c1de3a7e6d9d2095a757" gracePeriod=2 Nov 25 09:07:33 crc kubenswrapper[4565]: I1125 09:07:33.765054 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50c07298-4434-42c5-ab63-fe196405db28-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:33 crc kubenswrapper[4565]: I1125 09:07:33.765081 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzsgw\" (UniqueName: \"kubernetes.io/projected/50c07298-4434-42c5-ab63-fe196405db28-kube-api-access-jzsgw\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:33 crc kubenswrapper[4565]: I1125 09:07:33.765092 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50c07298-4434-42c5-ab63-fe196405db28-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.071097 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qzjfq" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.113750 4565 generic.go:334] "Generic (PLEG): container finished" podID="9d44c664-75e0-4020-a8f6-8bd5ac874798" containerID="707a09ae997c9b1b89378ef76c1fe22645b61d8dff64c1de3a7e6d9d2095a757" exitCode=0 Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.113814 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qzjfq" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.113820 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzjfq" event={"ID":"9d44c664-75e0-4020-a8f6-8bd5ac874798","Type":"ContainerDied","Data":"707a09ae997c9b1b89378ef76c1fe22645b61d8dff64c1de3a7e6d9d2095a757"} Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.113847 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzjfq" event={"ID":"9d44c664-75e0-4020-a8f6-8bd5ac874798","Type":"ContainerDied","Data":"da0b9c8d32461d1435c2e921aaf40c43735c3cca3544dcf4345e0ba568edda38"} Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.113875 4565 scope.go:117] "RemoveContainer" containerID="707a09ae997c9b1b89378ef76c1fe22645b61d8dff64c1de3a7e6d9d2095a757" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.116274 4565 generic.go:334] "Generic (PLEG): container finished" podID="50c07298-4434-42c5-ab63-fe196405db28" containerID="b2b4e732d839e1803721e4d3c1e3f549e3f7ade4173a607b45da1196dc0bb799" exitCode=0 Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.116978 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pwl47" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.117215 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pwl47" event={"ID":"50c07298-4434-42c5-ab63-fe196405db28","Type":"ContainerDied","Data":"b2b4e732d839e1803721e4d3c1e3f549e3f7ade4173a607b45da1196dc0bb799"} Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.117261 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pwl47" event={"ID":"50c07298-4434-42c5-ab63-fe196405db28","Type":"ContainerDied","Data":"e11fbaa7438dbcea99cc2a5db1e182efb215dd1c874dc0114fe9a42abc71813d"} Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.130251 4565 scope.go:117] "RemoveContainer" containerID="c4189774383d9536bb065a03b6b6238bfaab863b56938609dc4e9f8ddb12b925" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.140768 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pwl47"] Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.142669 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pwl47"] Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.152550 4565 scope.go:117] "RemoveContainer" containerID="d7a26242e4411044b947b6d6dd3199f2728121dbbf06d9cc534dff81e8356891" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.163483 4565 scope.go:117] "RemoveContainer" containerID="707a09ae997c9b1b89378ef76c1fe22645b61d8dff64c1de3a7e6d9d2095a757" Nov 25 09:07:34 crc kubenswrapper[4565]: E1125 09:07:34.163760 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"707a09ae997c9b1b89378ef76c1fe22645b61d8dff64c1de3a7e6d9d2095a757\": container with ID starting with 707a09ae997c9b1b89378ef76c1fe22645b61d8dff64c1de3a7e6d9d2095a757 not found: ID does not exist" 
containerID="707a09ae997c9b1b89378ef76c1fe22645b61d8dff64c1de3a7e6d9d2095a757" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.163787 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"707a09ae997c9b1b89378ef76c1fe22645b61d8dff64c1de3a7e6d9d2095a757"} err="failed to get container status \"707a09ae997c9b1b89378ef76c1fe22645b61d8dff64c1de3a7e6d9d2095a757\": rpc error: code = NotFound desc = could not find container \"707a09ae997c9b1b89378ef76c1fe22645b61d8dff64c1de3a7e6d9d2095a757\": container with ID starting with 707a09ae997c9b1b89378ef76c1fe22645b61d8dff64c1de3a7e6d9d2095a757 not found: ID does not exist" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.163820 4565 scope.go:117] "RemoveContainer" containerID="c4189774383d9536bb065a03b6b6238bfaab863b56938609dc4e9f8ddb12b925" Nov 25 09:07:34 crc kubenswrapper[4565]: E1125 09:07:34.164084 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4189774383d9536bb065a03b6b6238bfaab863b56938609dc4e9f8ddb12b925\": container with ID starting with c4189774383d9536bb065a03b6b6238bfaab863b56938609dc4e9f8ddb12b925 not found: ID does not exist" containerID="c4189774383d9536bb065a03b6b6238bfaab863b56938609dc4e9f8ddb12b925" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.164105 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4189774383d9536bb065a03b6b6238bfaab863b56938609dc4e9f8ddb12b925"} err="failed to get container status \"c4189774383d9536bb065a03b6b6238bfaab863b56938609dc4e9f8ddb12b925\": rpc error: code = NotFound desc = could not find container \"c4189774383d9536bb065a03b6b6238bfaab863b56938609dc4e9f8ddb12b925\": container with ID starting with c4189774383d9536bb065a03b6b6238bfaab863b56938609dc4e9f8ddb12b925 not found: ID does not exist" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.164123 4565 scope.go:117] 
"RemoveContainer" containerID="d7a26242e4411044b947b6d6dd3199f2728121dbbf06d9cc534dff81e8356891" Nov 25 09:07:34 crc kubenswrapper[4565]: E1125 09:07:34.164441 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7a26242e4411044b947b6d6dd3199f2728121dbbf06d9cc534dff81e8356891\": container with ID starting with d7a26242e4411044b947b6d6dd3199f2728121dbbf06d9cc534dff81e8356891 not found: ID does not exist" containerID="d7a26242e4411044b947b6d6dd3199f2728121dbbf06d9cc534dff81e8356891" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.164460 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7a26242e4411044b947b6d6dd3199f2728121dbbf06d9cc534dff81e8356891"} err="failed to get container status \"d7a26242e4411044b947b6d6dd3199f2728121dbbf06d9cc534dff81e8356891\": rpc error: code = NotFound desc = could not find container \"d7a26242e4411044b947b6d6dd3199f2728121dbbf06d9cc534dff81e8356891\": container with ID starting with d7a26242e4411044b947b6d6dd3199f2728121dbbf06d9cc534dff81e8356891 not found: ID does not exist" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.164474 4565 scope.go:117] "RemoveContainer" containerID="b2b4e732d839e1803721e4d3c1e3f549e3f7ade4173a607b45da1196dc0bb799" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.169238 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2qnn\" (UniqueName: \"kubernetes.io/projected/9d44c664-75e0-4020-a8f6-8bd5ac874798-kube-api-access-q2qnn\") pod \"9d44c664-75e0-4020-a8f6-8bd5ac874798\" (UID: \"9d44c664-75e0-4020-a8f6-8bd5ac874798\") " Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.169292 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d44c664-75e0-4020-a8f6-8bd5ac874798-catalog-content\") pod 
\"9d44c664-75e0-4020-a8f6-8bd5ac874798\" (UID: \"9d44c664-75e0-4020-a8f6-8bd5ac874798\") " Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.169318 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d44c664-75e0-4020-a8f6-8bd5ac874798-utilities\") pod \"9d44c664-75e0-4020-a8f6-8bd5ac874798\" (UID: \"9d44c664-75e0-4020-a8f6-8bd5ac874798\") " Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.170847 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d44c664-75e0-4020-a8f6-8bd5ac874798-utilities" (OuterVolumeSpecName: "utilities") pod "9d44c664-75e0-4020-a8f6-8bd5ac874798" (UID: "9d44c664-75e0-4020-a8f6-8bd5ac874798"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.174141 4565 scope.go:117] "RemoveContainer" containerID="912e1a091471d055b9205c8a1cb4581d3a131b58ab0c7a1a169ec2a68469b205" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.176376 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d44c664-75e0-4020-a8f6-8bd5ac874798-kube-api-access-q2qnn" (OuterVolumeSpecName: "kube-api-access-q2qnn") pod "9d44c664-75e0-4020-a8f6-8bd5ac874798" (UID: "9d44c664-75e0-4020-a8f6-8bd5ac874798"). InnerVolumeSpecName "kube-api-access-q2qnn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.188695 4565 scope.go:117] "RemoveContainer" containerID="747f2c7033e6ec153cc18be4fc5acd8942e38f1262e43aed01b02d61174e6e45" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.200445 4565 scope.go:117] "RemoveContainer" containerID="b2b4e732d839e1803721e4d3c1e3f549e3f7ade4173a607b45da1196dc0bb799" Nov 25 09:07:34 crc kubenswrapper[4565]: E1125 09:07:34.200701 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2b4e732d839e1803721e4d3c1e3f549e3f7ade4173a607b45da1196dc0bb799\": container with ID starting with b2b4e732d839e1803721e4d3c1e3f549e3f7ade4173a607b45da1196dc0bb799 not found: ID does not exist" containerID="b2b4e732d839e1803721e4d3c1e3f549e3f7ade4173a607b45da1196dc0bb799" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.200731 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2b4e732d839e1803721e4d3c1e3f549e3f7ade4173a607b45da1196dc0bb799"} err="failed to get container status \"b2b4e732d839e1803721e4d3c1e3f549e3f7ade4173a607b45da1196dc0bb799\": rpc error: code = NotFound desc = could not find container \"b2b4e732d839e1803721e4d3c1e3f549e3f7ade4173a607b45da1196dc0bb799\": container with ID starting with b2b4e732d839e1803721e4d3c1e3f549e3f7ade4173a607b45da1196dc0bb799 not found: ID does not exist" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.200749 4565 scope.go:117] "RemoveContainer" containerID="912e1a091471d055b9205c8a1cb4581d3a131b58ab0c7a1a169ec2a68469b205" Nov 25 09:07:34 crc kubenswrapper[4565]: E1125 09:07:34.200947 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"912e1a091471d055b9205c8a1cb4581d3a131b58ab0c7a1a169ec2a68469b205\": container with ID starting with 
912e1a091471d055b9205c8a1cb4581d3a131b58ab0c7a1a169ec2a68469b205 not found: ID does not exist" containerID="912e1a091471d055b9205c8a1cb4581d3a131b58ab0c7a1a169ec2a68469b205" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.200967 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"912e1a091471d055b9205c8a1cb4581d3a131b58ab0c7a1a169ec2a68469b205"} err="failed to get container status \"912e1a091471d055b9205c8a1cb4581d3a131b58ab0c7a1a169ec2a68469b205\": rpc error: code = NotFound desc = could not find container \"912e1a091471d055b9205c8a1cb4581d3a131b58ab0c7a1a169ec2a68469b205\": container with ID starting with 912e1a091471d055b9205c8a1cb4581d3a131b58ab0c7a1a169ec2a68469b205 not found: ID does not exist" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.200983 4565 scope.go:117] "RemoveContainer" containerID="747f2c7033e6ec153cc18be4fc5acd8942e38f1262e43aed01b02d61174e6e45" Nov 25 09:07:34 crc kubenswrapper[4565]: E1125 09:07:34.201235 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"747f2c7033e6ec153cc18be4fc5acd8942e38f1262e43aed01b02d61174e6e45\": container with ID starting with 747f2c7033e6ec153cc18be4fc5acd8942e38f1262e43aed01b02d61174e6e45 not found: ID does not exist" containerID="747f2c7033e6ec153cc18be4fc5acd8942e38f1262e43aed01b02d61174e6e45" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.201256 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"747f2c7033e6ec153cc18be4fc5acd8942e38f1262e43aed01b02d61174e6e45"} err="failed to get container status \"747f2c7033e6ec153cc18be4fc5acd8942e38f1262e43aed01b02d61174e6e45\": rpc error: code = NotFound desc = could not find container \"747f2c7033e6ec153cc18be4fc5acd8942e38f1262e43aed01b02d61174e6e45\": container with ID starting with 747f2c7033e6ec153cc18be4fc5acd8942e38f1262e43aed01b02d61174e6e45 not found: ID does not 
exist" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.213277 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d44c664-75e0-4020-a8f6-8bd5ac874798-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d44c664-75e0-4020-a8f6-8bd5ac874798" (UID: "9d44c664-75e0-4020-a8f6-8bd5ac874798"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.270587 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2qnn\" (UniqueName: \"kubernetes.io/projected/9d44c664-75e0-4020-a8f6-8bd5ac874798-kube-api-access-q2qnn\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.270614 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d44c664-75e0-4020-a8f6-8bd5ac874798-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.270625 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d44c664-75e0-4020-a8f6-8bd5ac874798-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.432621 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qzjfq"] Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.437570 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qzjfq"] Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.447558 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j4tjs" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.480110 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j4tjs" Nov 25 09:07:34 
crc kubenswrapper[4565]: I1125 09:07:34.799201 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hgwp8" Nov 25 09:07:34 crc kubenswrapper[4565]: I1125 09:07:34.824366 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hgwp8" Nov 25 09:07:35 crc kubenswrapper[4565]: I1125 09:07:35.102393 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50c07298-4434-42c5-ab63-fe196405db28" path="/var/lib/kubelet/pods/50c07298-4434-42c5-ab63-fe196405db28/volumes" Nov 25 09:07:35 crc kubenswrapper[4565]: I1125 09:07:35.102957 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d44c664-75e0-4020-a8f6-8bd5ac874798" path="/var/lib/kubelet/pods/9d44c664-75e0-4020-a8f6-8bd5ac874798/volumes" Nov 25 09:07:38 crc kubenswrapper[4565]: I1125 09:07:38.118473 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hgwp8"] Nov 25 09:07:38 crc kubenswrapper[4565]: I1125 09:07:38.118836 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hgwp8" podUID="c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1" containerName="registry-server" containerID="cri-o://cc92db78dd94eb88663533cbb1f1a2f52a3fc4ad9f5ee8b3d663d68af1733b72" gracePeriod=2 Nov 25 09:07:38 crc kubenswrapper[4565]: I1125 09:07:38.452434 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hgwp8" Nov 25 09:07:38 crc kubenswrapper[4565]: I1125 09:07:38.614069 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9bw7\" (UniqueName: \"kubernetes.io/projected/c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1-kube-api-access-v9bw7\") pod \"c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1\" (UID: \"c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1\") " Nov 25 09:07:38 crc kubenswrapper[4565]: I1125 09:07:38.614118 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1-catalog-content\") pod \"c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1\" (UID: \"c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1\") " Nov 25 09:07:38 crc kubenswrapper[4565]: I1125 09:07:38.614150 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1-utilities\") pod \"c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1\" (UID: \"c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1\") " Nov 25 09:07:38 crc kubenswrapper[4565]: I1125 09:07:38.614794 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1-utilities" (OuterVolumeSpecName: "utilities") pod "c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1" (UID: "c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:07:38 crc kubenswrapper[4565]: I1125 09:07:38.618323 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1-kube-api-access-v9bw7" (OuterVolumeSpecName: "kube-api-access-v9bw7") pod "c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1" (UID: "c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1"). InnerVolumeSpecName "kube-api-access-v9bw7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:07:38 crc kubenswrapper[4565]: I1125 09:07:38.676122 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1" (UID: "c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:07:38 crc kubenswrapper[4565]: I1125 09:07:38.715199 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:38 crc kubenswrapper[4565]: I1125 09:07:38.715226 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9bw7\" (UniqueName: \"kubernetes.io/projected/c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1-kube-api-access-v9bw7\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:38 crc kubenswrapper[4565]: I1125 09:07:38.715237 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:39 crc kubenswrapper[4565]: I1125 09:07:39.138978 4565 generic.go:334] "Generic (PLEG): container finished" podID="c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1" containerID="cc92db78dd94eb88663533cbb1f1a2f52a3fc4ad9f5ee8b3d663d68af1733b72" exitCode=0 Nov 25 09:07:39 crc kubenswrapper[4565]: I1125 09:07:39.139016 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgwp8" event={"ID":"c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1","Type":"ContainerDied","Data":"cc92db78dd94eb88663533cbb1f1a2f52a3fc4ad9f5ee8b3d663d68af1733b72"} Nov 25 09:07:39 crc kubenswrapper[4565]: I1125 09:07:39.139040 4565 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-hgwp8" event={"ID":"c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1","Type":"ContainerDied","Data":"0cbd2f7adc0de8c052b1c68c993adfa6ccd8eed8af3e482dd420082d7884572c"} Nov 25 09:07:39 crc kubenswrapper[4565]: I1125 09:07:39.139037 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hgwp8" Nov 25 09:07:39 crc kubenswrapper[4565]: I1125 09:07:39.139055 4565 scope.go:117] "RemoveContainer" containerID="cc92db78dd94eb88663533cbb1f1a2f52a3fc4ad9f5ee8b3d663d68af1733b72" Nov 25 09:07:39 crc kubenswrapper[4565]: I1125 09:07:39.150414 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hgwp8"] Nov 25 09:07:39 crc kubenswrapper[4565]: I1125 09:07:39.152940 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hgwp8"] Nov 25 09:07:39 crc kubenswrapper[4565]: I1125 09:07:39.153396 4565 scope.go:117] "RemoveContainer" containerID="1f58c324e31e333b8e99b8e088313348c6f8eab397b43f0684b1be5f2a48a5e3" Nov 25 09:07:39 crc kubenswrapper[4565]: I1125 09:07:39.167762 4565 scope.go:117] "RemoveContainer" containerID="72caca8b264ddee000536dbc3a66fbef4955c6f4f13e492296527d5d6cecfbfe" Nov 25 09:07:39 crc kubenswrapper[4565]: I1125 09:07:39.176923 4565 scope.go:117] "RemoveContainer" containerID="cc92db78dd94eb88663533cbb1f1a2f52a3fc4ad9f5ee8b3d663d68af1733b72" Nov 25 09:07:39 crc kubenswrapper[4565]: E1125 09:07:39.178120 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc92db78dd94eb88663533cbb1f1a2f52a3fc4ad9f5ee8b3d663d68af1733b72\": container with ID starting with cc92db78dd94eb88663533cbb1f1a2f52a3fc4ad9f5ee8b3d663d68af1733b72 not found: ID does not exist" containerID="cc92db78dd94eb88663533cbb1f1a2f52a3fc4ad9f5ee8b3d663d68af1733b72" Nov 25 09:07:39 crc kubenswrapper[4565]: I1125 09:07:39.178158 4565 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc92db78dd94eb88663533cbb1f1a2f52a3fc4ad9f5ee8b3d663d68af1733b72"} err="failed to get container status \"cc92db78dd94eb88663533cbb1f1a2f52a3fc4ad9f5ee8b3d663d68af1733b72\": rpc error: code = NotFound desc = could not find container \"cc92db78dd94eb88663533cbb1f1a2f52a3fc4ad9f5ee8b3d663d68af1733b72\": container with ID starting with cc92db78dd94eb88663533cbb1f1a2f52a3fc4ad9f5ee8b3d663d68af1733b72 not found: ID does not exist" Nov 25 09:07:39 crc kubenswrapper[4565]: I1125 09:07:39.178198 4565 scope.go:117] "RemoveContainer" containerID="1f58c324e31e333b8e99b8e088313348c6f8eab397b43f0684b1be5f2a48a5e3" Nov 25 09:07:39 crc kubenswrapper[4565]: E1125 09:07:39.178586 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f58c324e31e333b8e99b8e088313348c6f8eab397b43f0684b1be5f2a48a5e3\": container with ID starting with 1f58c324e31e333b8e99b8e088313348c6f8eab397b43f0684b1be5f2a48a5e3 not found: ID does not exist" containerID="1f58c324e31e333b8e99b8e088313348c6f8eab397b43f0684b1be5f2a48a5e3" Nov 25 09:07:39 crc kubenswrapper[4565]: I1125 09:07:39.178626 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f58c324e31e333b8e99b8e088313348c6f8eab397b43f0684b1be5f2a48a5e3"} err="failed to get container status \"1f58c324e31e333b8e99b8e088313348c6f8eab397b43f0684b1be5f2a48a5e3\": rpc error: code = NotFound desc = could not find container \"1f58c324e31e333b8e99b8e088313348c6f8eab397b43f0684b1be5f2a48a5e3\": container with ID starting with 1f58c324e31e333b8e99b8e088313348c6f8eab397b43f0684b1be5f2a48a5e3 not found: ID does not exist" Nov 25 09:07:39 crc kubenswrapper[4565]: I1125 09:07:39.178639 4565 scope.go:117] "RemoveContainer" containerID="72caca8b264ddee000536dbc3a66fbef4955c6f4f13e492296527d5d6cecfbfe" Nov 25 09:07:39 crc kubenswrapper[4565]: E1125 
09:07:39.178864 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72caca8b264ddee000536dbc3a66fbef4955c6f4f13e492296527d5d6cecfbfe\": container with ID starting with 72caca8b264ddee000536dbc3a66fbef4955c6f4f13e492296527d5d6cecfbfe not found: ID does not exist" containerID="72caca8b264ddee000536dbc3a66fbef4955c6f4f13e492296527d5d6cecfbfe" Nov 25 09:07:39 crc kubenswrapper[4565]: I1125 09:07:39.178892 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72caca8b264ddee000536dbc3a66fbef4955c6f4f13e492296527d5d6cecfbfe"} err="failed to get container status \"72caca8b264ddee000536dbc3a66fbef4955c6f4f13e492296527d5d6cecfbfe\": rpc error: code = NotFound desc = could not find container \"72caca8b264ddee000536dbc3a66fbef4955c6f4f13e492296527d5d6cecfbfe\": container with ID starting with 72caca8b264ddee000536dbc3a66fbef4955c6f4f13e492296527d5d6cecfbfe not found: ID does not exist" Nov 25 09:07:41 crc kubenswrapper[4565]: I1125 09:07:41.102194 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1" path="/var/lib/kubelet/pods/c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1/volumes" Nov 25 09:07:43 crc kubenswrapper[4565]: I1125 09:07:43.541025 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hcxpx" Nov 25 09:07:47 crc kubenswrapper[4565]: I1125 09:07:47.518998 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcxpx"] Nov 25 09:07:47 crc kubenswrapper[4565]: I1125 09:07:47.519499 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hcxpx" podUID="6e1cda59-b195-4f26-9b05-156ac97681d5" containerName="registry-server" containerID="cri-o://119d5bc3b6f8ada5bb9a76851f31fddb7e089403e19425bb518aa7ef8d235feb" gracePeriod=2 Nov 25 
09:07:47 crc kubenswrapper[4565]: I1125 09:07:47.825480 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hcxpx" Nov 25 09:07:48 crc kubenswrapper[4565]: I1125 09:07:48.005125 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e1cda59-b195-4f26-9b05-156ac97681d5-catalog-content\") pod \"6e1cda59-b195-4f26-9b05-156ac97681d5\" (UID: \"6e1cda59-b195-4f26-9b05-156ac97681d5\") " Nov 25 09:07:48 crc kubenswrapper[4565]: I1125 09:07:48.005389 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e1cda59-b195-4f26-9b05-156ac97681d5-utilities\") pod \"6e1cda59-b195-4f26-9b05-156ac97681d5\" (UID: \"6e1cda59-b195-4f26-9b05-156ac97681d5\") " Nov 25 09:07:48 crc kubenswrapper[4565]: I1125 09:07:48.005414 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f7m2\" (UniqueName: \"kubernetes.io/projected/6e1cda59-b195-4f26-9b05-156ac97681d5-kube-api-access-6f7m2\") pod \"6e1cda59-b195-4f26-9b05-156ac97681d5\" (UID: \"6e1cda59-b195-4f26-9b05-156ac97681d5\") " Nov 25 09:07:48 crc kubenswrapper[4565]: I1125 09:07:48.005947 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e1cda59-b195-4f26-9b05-156ac97681d5-utilities" (OuterVolumeSpecName: "utilities") pod "6e1cda59-b195-4f26-9b05-156ac97681d5" (UID: "6e1cda59-b195-4f26-9b05-156ac97681d5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:07:48 crc kubenswrapper[4565]: I1125 09:07:48.009372 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e1cda59-b195-4f26-9b05-156ac97681d5-kube-api-access-6f7m2" (OuterVolumeSpecName: "kube-api-access-6f7m2") pod "6e1cda59-b195-4f26-9b05-156ac97681d5" (UID: "6e1cda59-b195-4f26-9b05-156ac97681d5"). InnerVolumeSpecName "kube-api-access-6f7m2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:07:48 crc kubenswrapper[4565]: I1125 09:07:48.016645 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e1cda59-b195-4f26-9b05-156ac97681d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e1cda59-b195-4f26-9b05-156ac97681d5" (UID: "6e1cda59-b195-4f26-9b05-156ac97681d5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:07:48 crc kubenswrapper[4565]: I1125 09:07:48.106887 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e1cda59-b195-4f26-9b05-156ac97681d5-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:48 crc kubenswrapper[4565]: I1125 09:07:48.106918 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f7m2\" (UniqueName: \"kubernetes.io/projected/6e1cda59-b195-4f26-9b05-156ac97681d5-kube-api-access-6f7m2\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:48 crc kubenswrapper[4565]: I1125 09:07:48.106961 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e1cda59-b195-4f26-9b05-156ac97681d5-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:48 crc kubenswrapper[4565]: I1125 09:07:48.173822 4565 generic.go:334] "Generic (PLEG): container finished" podID="6e1cda59-b195-4f26-9b05-156ac97681d5" 
containerID="119d5bc3b6f8ada5bb9a76851f31fddb7e089403e19425bb518aa7ef8d235feb" exitCode=0 Nov 25 09:07:48 crc kubenswrapper[4565]: I1125 09:07:48.173860 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcxpx" event={"ID":"6e1cda59-b195-4f26-9b05-156ac97681d5","Type":"ContainerDied","Data":"119d5bc3b6f8ada5bb9a76851f31fddb7e089403e19425bb518aa7ef8d235feb"} Nov 25 09:07:48 crc kubenswrapper[4565]: I1125 09:07:48.173886 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcxpx" event={"ID":"6e1cda59-b195-4f26-9b05-156ac97681d5","Type":"ContainerDied","Data":"7636c716a3f231ac6150060a6825707ea58b2bef784d3758902a293368db7eeb"} Nov 25 09:07:48 crc kubenswrapper[4565]: I1125 09:07:48.173903 4565 scope.go:117] "RemoveContainer" containerID="119d5bc3b6f8ada5bb9a76851f31fddb7e089403e19425bb518aa7ef8d235feb" Nov 25 09:07:48 crc kubenswrapper[4565]: I1125 09:07:48.174020 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hcxpx" Nov 25 09:07:48 crc kubenswrapper[4565]: I1125 09:07:48.186539 4565 scope.go:117] "RemoveContainer" containerID="dce7c5509d2bb2e0bebfbac6d900344452992909fe922eac4eea99fa8769d45b" Nov 25 09:07:48 crc kubenswrapper[4565]: I1125 09:07:48.193323 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcxpx"] Nov 25 09:07:48 crc kubenswrapper[4565]: I1125 09:07:48.197970 4565 scope.go:117] "RemoveContainer" containerID="0928bb4e735affa92e29ef183cd726e4fa2340634c641c66e3c5b8e34d147492" Nov 25 09:07:48 crc kubenswrapper[4565]: I1125 09:07:48.213727 4565 scope.go:117] "RemoveContainer" containerID="119d5bc3b6f8ada5bb9a76851f31fddb7e089403e19425bb518aa7ef8d235feb" Nov 25 09:07:48 crc kubenswrapper[4565]: E1125 09:07:48.214105 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"119d5bc3b6f8ada5bb9a76851f31fddb7e089403e19425bb518aa7ef8d235feb\": container with ID starting with 119d5bc3b6f8ada5bb9a76851f31fddb7e089403e19425bb518aa7ef8d235feb not found: ID does not exist" containerID="119d5bc3b6f8ada5bb9a76851f31fddb7e089403e19425bb518aa7ef8d235feb" Nov 25 09:07:48 crc kubenswrapper[4565]: I1125 09:07:48.214140 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"119d5bc3b6f8ada5bb9a76851f31fddb7e089403e19425bb518aa7ef8d235feb"} err="failed to get container status \"119d5bc3b6f8ada5bb9a76851f31fddb7e089403e19425bb518aa7ef8d235feb\": rpc error: code = NotFound desc = could not find container \"119d5bc3b6f8ada5bb9a76851f31fddb7e089403e19425bb518aa7ef8d235feb\": container with ID starting with 119d5bc3b6f8ada5bb9a76851f31fddb7e089403e19425bb518aa7ef8d235feb not found: ID does not exist" Nov 25 09:07:48 crc kubenswrapper[4565]: I1125 09:07:48.214159 4565 scope.go:117] "RemoveContainer" 
containerID="dce7c5509d2bb2e0bebfbac6d900344452992909fe922eac4eea99fa8769d45b" Nov 25 09:07:48 crc kubenswrapper[4565]: E1125 09:07:48.214493 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dce7c5509d2bb2e0bebfbac6d900344452992909fe922eac4eea99fa8769d45b\": container with ID starting with dce7c5509d2bb2e0bebfbac6d900344452992909fe922eac4eea99fa8769d45b not found: ID does not exist" containerID="dce7c5509d2bb2e0bebfbac6d900344452992909fe922eac4eea99fa8769d45b" Nov 25 09:07:48 crc kubenswrapper[4565]: I1125 09:07:48.214510 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dce7c5509d2bb2e0bebfbac6d900344452992909fe922eac4eea99fa8769d45b"} err="failed to get container status \"dce7c5509d2bb2e0bebfbac6d900344452992909fe922eac4eea99fa8769d45b\": rpc error: code = NotFound desc = could not find container \"dce7c5509d2bb2e0bebfbac6d900344452992909fe922eac4eea99fa8769d45b\": container with ID starting with dce7c5509d2bb2e0bebfbac6d900344452992909fe922eac4eea99fa8769d45b not found: ID does not exist" Nov 25 09:07:48 crc kubenswrapper[4565]: I1125 09:07:48.214522 4565 scope.go:117] "RemoveContainer" containerID="0928bb4e735affa92e29ef183cd726e4fa2340634c641c66e3c5b8e34d147492" Nov 25 09:07:48 crc kubenswrapper[4565]: E1125 09:07:48.214746 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0928bb4e735affa92e29ef183cd726e4fa2340634c641c66e3c5b8e34d147492\": container with ID starting with 0928bb4e735affa92e29ef183cd726e4fa2340634c641c66e3c5b8e34d147492 not found: ID does not exist" containerID="0928bb4e735affa92e29ef183cd726e4fa2340634c641c66e3c5b8e34d147492" Nov 25 09:07:48 crc kubenswrapper[4565]: I1125 09:07:48.214760 4565 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0928bb4e735affa92e29ef183cd726e4fa2340634c641c66e3c5b8e34d147492"} err="failed to get container status \"0928bb4e735affa92e29ef183cd726e4fa2340634c641c66e3c5b8e34d147492\": rpc error: code = NotFound desc = could not find container \"0928bb4e735affa92e29ef183cd726e4fa2340634c641c66e3c5b8e34d147492\": container with ID starting with 0928bb4e735affa92e29ef183cd726e4fa2340634c641c66e3c5b8e34d147492 not found: ID does not exist" Nov 25 09:07:48 crc kubenswrapper[4565]: I1125 09:07:48.215115 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcxpx"] Nov 25 09:07:49 crc kubenswrapper[4565]: I1125 09:07:49.102303 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e1cda59-b195-4f26-9b05-156ac97681d5" path="/var/lib/kubelet/pods/6e1cda59-b195-4f26-9b05-156ac97681d5/volumes" Nov 25 09:07:55 crc kubenswrapper[4565]: I1125 09:07:55.099254 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:07:55 crc kubenswrapper[4565]: I1125 09:07:55.099495 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:07:55 crc kubenswrapper[4565]: I1125 09:07:55.102192 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 09:07:55 crc kubenswrapper[4565]: I1125 09:07:55.103334 4565 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b0f35d7105f4f7ed4b023a99ac5b6878e1c205402a2133c7131e341db10af708"} pod="openshift-machine-config-operator/machine-config-daemon-r28bt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 09:07:55 crc kubenswrapper[4565]: I1125 09:07:55.103405 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" containerID="cri-o://b0f35d7105f4f7ed4b023a99ac5b6878e1c205402a2133c7131e341db10af708" gracePeriod=600 Nov 25 09:07:56 crc kubenswrapper[4565]: I1125 09:07:56.202339 4565 generic.go:334] "Generic (PLEG): container finished" podID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerID="b0f35d7105f4f7ed4b023a99ac5b6878e1c205402a2133c7131e341db10af708" exitCode=0 Nov 25 09:07:56 crc kubenswrapper[4565]: I1125 09:07:56.202411 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerDied","Data":"b0f35d7105f4f7ed4b023a99ac5b6878e1c205402a2133c7131e341db10af708"} Nov 25 09:07:56 crc kubenswrapper[4565]: I1125 09:07:56.202507 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerStarted","Data":"cf382e7b4350d947c6ba89dc567329878d4281ac0e389a463a63fd9a1cf7db93"} Nov 25 09:07:56 crc kubenswrapper[4565]: I1125 09:07:56.802113 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" podUID="29e91702-afc2-470f-a3b9-9be851b01f9c" containerName="oauth-openshift" containerID="cri-o://e849b5f659d32513a2059d4874784291a37829cc5d84c7ec44960fbdd2d2139a" 
gracePeriod=15 Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.071339 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.198040 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-service-ca\") pod \"29e91702-afc2-470f-a3b9-9be851b01f9c\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.198094 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5gvw\" (UniqueName: \"kubernetes.io/projected/29e91702-afc2-470f-a3b9-9be851b01f9c-kube-api-access-w5gvw\") pod \"29e91702-afc2-470f-a3b9-9be851b01f9c\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.198152 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-cliconfig\") pod \"29e91702-afc2-470f-a3b9-9be851b01f9c\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.198167 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-user-template-error\") pod \"29e91702-afc2-470f-a3b9-9be851b01f9c\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.198188 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-user-idp-0-file-data\") pod \"29e91702-afc2-470f-a3b9-9be851b01f9c\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.198204 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/29e91702-afc2-470f-a3b9-9be851b01f9c-audit-policies\") pod \"29e91702-afc2-470f-a3b9-9be851b01f9c\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.198219 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-user-template-login\") pod \"29e91702-afc2-470f-a3b9-9be851b01f9c\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.198235 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-session\") pod \"29e91702-afc2-470f-a3b9-9be851b01f9c\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.198255 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-ocp-branding-template\") pod \"29e91702-afc2-470f-a3b9-9be851b01f9c\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.198476 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/29e91702-afc2-470f-a3b9-9be851b01f9c-audit-dir\") pod 
\"29e91702-afc2-470f-a3b9-9be851b01f9c\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.198492 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-trusted-ca-bundle\") pod \"29e91702-afc2-470f-a3b9-9be851b01f9c\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.198513 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-serving-cert\") pod \"29e91702-afc2-470f-a3b9-9be851b01f9c\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.198528 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-user-template-provider-selection\") pod \"29e91702-afc2-470f-a3b9-9be851b01f9c\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.198543 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-router-certs\") pod \"29e91702-afc2-470f-a3b9-9be851b01f9c\" (UID: \"29e91702-afc2-470f-a3b9-9be851b01f9c\") " Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.198689 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod 
"29e91702-afc2-470f-a3b9-9be851b01f9c" (UID: "29e91702-afc2-470f-a3b9-9be851b01f9c"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.198705 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29e91702-afc2-470f-a3b9-9be851b01f9c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "29e91702-afc2-470f-a3b9-9be851b01f9c" (UID: "29e91702-afc2-470f-a3b9-9be851b01f9c"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.198742 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "29e91702-afc2-470f-a3b9-9be851b01f9c" (UID: "29e91702-afc2-470f-a3b9-9be851b01f9c"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.198796 4565 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.198808 4565 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/29e91702-afc2-470f-a3b9-9be851b01f9c-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.198818 4565 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.199285 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29e91702-afc2-470f-a3b9-9be851b01f9c-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "29e91702-afc2-470f-a3b9-9be851b01f9c" (UID: "29e91702-afc2-470f-a3b9-9be851b01f9c"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.203197 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "29e91702-afc2-470f-a3b9-9be851b01f9c" (UID: "29e91702-afc2-470f-a3b9-9be851b01f9c"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.203488 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "29e91702-afc2-470f-a3b9-9be851b01f9c" (UID: "29e91702-afc2-470f-a3b9-9be851b01f9c"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.203579 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "29e91702-afc2-470f-a3b9-9be851b01f9c" (UID: "29e91702-afc2-470f-a3b9-9be851b01f9c"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.203721 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "29e91702-afc2-470f-a3b9-9be851b01f9c" (UID: "29e91702-afc2-470f-a3b9-9be851b01f9c"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.203843 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "29e91702-afc2-470f-a3b9-9be851b01f9c" (UID: "29e91702-afc2-470f-a3b9-9be851b01f9c"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.204109 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "29e91702-afc2-470f-a3b9-9be851b01f9c" (UID: "29e91702-afc2-470f-a3b9-9be851b01f9c"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.204259 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "29e91702-afc2-470f-a3b9-9be851b01f9c" (UID: "29e91702-afc2-470f-a3b9-9be851b01f9c"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.208798 4565 generic.go:334] "Generic (PLEG): container finished" podID="29e91702-afc2-470f-a3b9-9be851b01f9c" containerID="e849b5f659d32513a2059d4874784291a37829cc5d84c7ec44960fbdd2d2139a" exitCode=0 Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.208826 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" event={"ID":"29e91702-afc2-470f-a3b9-9be851b01f9c","Type":"ContainerDied","Data":"e849b5f659d32513a2059d4874784291a37829cc5d84c7ec44960fbdd2d2139a"} Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.208849 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" event={"ID":"29e91702-afc2-470f-a3b9-9be851b01f9c","Type":"ContainerDied","Data":"4cde4549eefc859b20747c28cdf16f857994d8c09c9c4defb0583c8061000777"} Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.208864 4565 scope.go:117] "RemoveContainer" containerID="e849b5f659d32513a2059d4874784291a37829cc5d84c7ec44960fbdd2d2139a" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.208862 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nzk24" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.216090 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "29e91702-afc2-470f-a3b9-9be851b01f9c" (UID: "29e91702-afc2-470f-a3b9-9be851b01f9c"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.216140 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29e91702-afc2-470f-a3b9-9be851b01f9c-kube-api-access-w5gvw" (OuterVolumeSpecName: "kube-api-access-w5gvw") pod "29e91702-afc2-470f-a3b9-9be851b01f9c" (UID: "29e91702-afc2-470f-a3b9-9be851b01f9c"). InnerVolumeSpecName "kube-api-access-w5gvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.216144 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "29e91702-afc2-470f-a3b9-9be851b01f9c" (UID: "29e91702-afc2-470f-a3b9-9be851b01f9c"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.235759 4565 scope.go:117] "RemoveContainer" containerID="e849b5f659d32513a2059d4874784291a37829cc5d84c7ec44960fbdd2d2139a" Nov 25 09:07:57 crc kubenswrapper[4565]: E1125 09:07:57.236007 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e849b5f659d32513a2059d4874784291a37829cc5d84c7ec44960fbdd2d2139a\": container with ID starting with e849b5f659d32513a2059d4874784291a37829cc5d84c7ec44960fbdd2d2139a not found: ID does not exist" containerID="e849b5f659d32513a2059d4874784291a37829cc5d84c7ec44960fbdd2d2139a" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.236032 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e849b5f659d32513a2059d4874784291a37829cc5d84c7ec44960fbdd2d2139a"} err="failed to get container status \"e849b5f659d32513a2059d4874784291a37829cc5d84c7ec44960fbdd2d2139a\": rpc error: code 
= NotFound desc = could not find container \"e849b5f659d32513a2059d4874784291a37829cc5d84c7ec44960fbdd2d2139a\": container with ID starting with e849b5f659d32513a2059d4874784291a37829cc5d84c7ec44960fbdd2d2139a not found: ID does not exist" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.299863 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5gvw\" (UniqueName: \"kubernetes.io/projected/29e91702-afc2-470f-a3b9-9be851b01f9c-kube-api-access-w5gvw\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.299889 4565 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.299899 4565 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.299910 4565 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/29e91702-afc2-470f-a3b9-9be851b01f9c-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.299920 4565 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.299943 4565 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:57 crc 
kubenswrapper[4565]: I1125 09:07:57.299954 4565 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.299963 4565 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.299971 4565 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.299979 4565 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.299989 4565 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/29e91702-afc2-470f-a3b9-9be851b01f9c-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.526599 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nzk24"] Nov 25 09:07:57 crc kubenswrapper[4565]: I1125 09:07:57.528215 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nzk24"] Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.413871 4565 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8"] Nov 25 09:07:58 crc kubenswrapper[4565]: E1125 09:07:58.414050 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d44c664-75e0-4020-a8f6-8bd5ac874798" containerName="registry-server" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.414062 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d44c664-75e0-4020-a8f6-8bd5ac874798" containerName="registry-server" Nov 25 09:07:58 crc kubenswrapper[4565]: E1125 09:07:58.414070 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d44c664-75e0-4020-a8f6-8bd5ac874798" containerName="extract-content" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.414077 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d44c664-75e0-4020-a8f6-8bd5ac874798" containerName="extract-content" Nov 25 09:07:58 crc kubenswrapper[4565]: E1125 09:07:58.414084 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb" containerName="pruner" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.414090 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb" containerName="pruner" Nov 25 09:07:58 crc kubenswrapper[4565]: E1125 09:07:58.414098 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e1cda59-b195-4f26-9b05-156ac97681d5" containerName="extract-utilities" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.414104 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1cda59-b195-4f26-9b05-156ac97681d5" containerName="extract-utilities" Nov 25 09:07:58 crc kubenswrapper[4565]: E1125 09:07:58.414111 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e1cda59-b195-4f26-9b05-156ac97681d5" containerName="registry-server" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.414117 4565 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6e1cda59-b195-4f26-9b05-156ac97681d5" containerName="registry-server" Nov 25 09:07:58 crc kubenswrapper[4565]: E1125 09:07:58.414124 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50c07298-4434-42c5-ab63-fe196405db28" containerName="registry-server" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.414129 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="50c07298-4434-42c5-ab63-fe196405db28" containerName="registry-server" Nov 25 09:07:58 crc kubenswrapper[4565]: E1125 09:07:58.414135 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1" containerName="registry-server" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.414140 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1" containerName="registry-server" Nov 25 09:07:58 crc kubenswrapper[4565]: E1125 09:07:58.414149 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29e91702-afc2-470f-a3b9-9be851b01f9c" containerName="oauth-openshift" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.414154 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e91702-afc2-470f-a3b9-9be851b01f9c" containerName="oauth-openshift" Nov 25 09:07:58 crc kubenswrapper[4565]: E1125 09:07:58.414160 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="433060c6-c174-4a38-9a33-224b8abcab6a" containerName="pruner" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.414165 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="433060c6-c174-4a38-9a33-224b8abcab6a" containerName="pruner" Nov 25 09:07:58 crc kubenswrapper[4565]: E1125 09:07:58.414170 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1" containerName="extract-utilities" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.414175 4565 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1" containerName="extract-utilities" Nov 25 09:07:58 crc kubenswrapper[4565]: E1125 09:07:58.414183 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1" containerName="extract-content" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.414188 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1" containerName="extract-content" Nov 25 09:07:58 crc kubenswrapper[4565]: E1125 09:07:58.414197 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50c07298-4434-42c5-ab63-fe196405db28" containerName="extract-utilities" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.414202 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="50c07298-4434-42c5-ab63-fe196405db28" containerName="extract-utilities" Nov 25 09:07:58 crc kubenswrapper[4565]: E1125 09:07:58.414207 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50c07298-4434-42c5-ab63-fe196405db28" containerName="extract-content" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.414212 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="50c07298-4434-42c5-ab63-fe196405db28" containerName="extract-content" Nov 25 09:07:58 crc kubenswrapper[4565]: E1125 09:07:58.414220 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e1cda59-b195-4f26-9b05-156ac97681d5" containerName="extract-content" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.414225 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1cda59-b195-4f26-9b05-156ac97681d5" containerName="extract-content" Nov 25 09:07:58 crc kubenswrapper[4565]: E1125 09:07:58.414233 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d44c664-75e0-4020-a8f6-8bd5ac874798" containerName="extract-utilities" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.414238 4565 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9d44c664-75e0-4020-a8f6-8bd5ac874798" containerName="extract-utilities" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.414307 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6bed9dd-6e26-4b6f-82db-ddf5ed2910b1" containerName="registry-server" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.414316 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e1cda59-b195-4f26-9b05-156ac97681d5" containerName="registry-server" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.414325 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="50c07298-4434-42c5-ab63-fe196405db28" containerName="registry-server" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.414333 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="29e91702-afc2-470f-a3b9-9be851b01f9c" containerName="oauth-openshift" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.414354 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="433060c6-c174-4a38-9a33-224b8abcab6a" containerName="pruner" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.414362 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d44c664-75e0-4020-a8f6-8bd5ac874798" containerName="registry-server" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.414370 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d3b34b3-7f9e-481d-83eb-8fdd094d5ecb" containerName="pruner" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.414679 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.417247 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.417548 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.417651 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.419526 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.419839 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.419986 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.420071 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.420116 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.419987 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.420094 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 25 09:07:58 crc 
kubenswrapper[4565]: I1125 09:07:58.420275 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.423373 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.427850 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.432027 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.433328 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.437858 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8"] Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.612528 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-user-template-error\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.612712 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-user-template-login\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " 
pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.612729 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.612747 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/23c30404-18bb-4f19-ac11-095057ecc69a-audit-dir\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.612767 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-system-service-ca\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.612782 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.612797 4565 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.612811 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/23c30404-18bb-4f19-ac11-095057ecc69a-audit-policies\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.612832 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.612848 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9sg4\" (UniqueName: \"kubernetes.io/projected/23c30404-18bb-4f19-ac11-095057ecc69a-kube-api-access-z9sg4\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.612870 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-system-router-certs\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.612910 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-system-session\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.612924 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.612994 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.714193 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-user-template-error\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: 
\"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.714222 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-user-template-login\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.714240 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.714256 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/23c30404-18bb-4f19-ac11-095057ecc69a-audit-dir\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.714276 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-system-service-ca\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.714291 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.714306 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.714320 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/23c30404-18bb-4f19-ac11-095057ecc69a-audit-policies\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.714350 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.714369 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9sg4\" (UniqueName: \"kubernetes.io/projected/23c30404-18bb-4f19-ac11-095057ecc69a-kube-api-access-z9sg4\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " 
pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.714401 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-system-router-certs\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.714420 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-system-session\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.714435 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.714452 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.715650 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/23c30404-18bb-4f19-ac11-095057ecc69a-audit-dir\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.716417 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-system-service-ca\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.716429 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.716544 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.716357 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/23c30404-18bb-4f19-ac11-095057ecc69a-audit-policies\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc 
kubenswrapper[4565]: I1125 09:07:58.718867 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.719074 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.719609 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-system-session\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.719724 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.722887 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-user-template-error\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.723101 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-system-router-certs\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.723133 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-user-template-login\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.725729 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/23c30404-18bb-4f19-ac11-095057ecc69a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:58 crc kubenswrapper[4565]: I1125 09:07:58.730230 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9sg4\" (UniqueName: \"kubernetes.io/projected/23c30404-18bb-4f19-ac11-095057ecc69a-kube-api-access-z9sg4\") pod \"oauth-openshift-77bfbb8d5b-pzvz8\" (UID: \"23c30404-18bb-4f19-ac11-095057ecc69a\") " pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:59 crc 
kubenswrapper[4565]: I1125 09:07:59.026141 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:07:59 crc kubenswrapper[4565]: I1125 09:07:59.102900 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29e91702-afc2-470f-a3b9-9be851b01f9c" path="/var/lib/kubelet/pods/29e91702-afc2-470f-a3b9-9be851b01f9c/volumes" Nov 25 09:07:59 crc kubenswrapper[4565]: I1125 09:07:59.348173 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8"] Nov 25 09:07:59 crc kubenswrapper[4565]: W1125 09:07:59.354847 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c30404_18bb_4f19_ac11_095057ecc69a.slice/crio-fcb31233f2f72b6d551797cb5ebc4717a40f2995d3fb27fd6ce5106a19c12c62 WatchSource:0}: Error finding container fcb31233f2f72b6d551797cb5ebc4717a40f2995d3fb27fd6ce5106a19c12c62: Status 404 returned error can't find the container with id fcb31233f2f72b6d551797cb5ebc4717a40f2995d3fb27fd6ce5106a19c12c62 Nov 25 09:08:00 crc kubenswrapper[4565]: I1125 09:08:00.221638 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" event={"ID":"23c30404-18bb-4f19-ac11-095057ecc69a","Type":"ContainerStarted","Data":"fdb6759c677913e706ce01836c3e150ab69004bce8c666e9631aa28c031d746e"} Nov 25 09:08:00 crc kubenswrapper[4565]: I1125 09:08:00.221839 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" event={"ID":"23c30404-18bb-4f19-ac11-095057ecc69a","Type":"ContainerStarted","Data":"fcb31233f2f72b6d551797cb5ebc4717a40f2995d3fb27fd6ce5106a19c12c62"} Nov 25 09:08:00 crc kubenswrapper[4565]: I1125 09:08:00.221855 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:08:00 crc kubenswrapper[4565]: I1125 09:08:00.226220 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" Nov 25 09:08:00 crc kubenswrapper[4565]: I1125 09:08:00.235529 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-77bfbb8d5b-pzvz8" podStartSLOduration=29.235514873 podStartE2EDuration="29.235514873s" podCreationTimestamp="2025-11-25 09:07:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:08:00.233801747 +0000 UTC m=+213.436296885" watchObservedRunningTime="2025-11-25 09:08:00.235514873 +0000 UTC m=+213.438010011" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.298663 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s9bxs"] Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.299172 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s9bxs" podUID="3f41ac7d-f98d-4d50-8346-fc730f41309c" containerName="registry-server" containerID="cri-o://f0d7c9ebfa8ffcd3860431f79b69809bd2fa44a97bb60cc5a4012f2ff7c9a0af" gracePeriod=30 Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.306688 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d66wt"] Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.306866 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d66wt" podUID="63ed490a-3e8d-485a-80d9-8c482c9172a1" containerName="registry-server" containerID="cri-o://e8fb1d87af8406f1d219900083e170e6928e0463c0a113a28a2323781bb18b06" gracePeriod=30 Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 
09:08:17.316370 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-f7mzc"] Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.316523 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-f7mzc" podUID="91f8c44a-7b95-4212-8976-753251e9959b" containerName="marketplace-operator" containerID="cri-o://47d0294b880dc0af156a2348ea192aadd1be95454983e96e4ccb6d7bfbc35844" gracePeriod=30 Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.321308 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hxkk7"] Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.321498 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hxkk7" podUID="5846c394-bccc-4d13-8cea-d3deb171c550" containerName="registry-server" containerID="cri-o://ab50a96570a551cca614783ff281bc14931bc7482fcf96177803e3beb713b998" gracePeriod=30 Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.331813 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rgcml"] Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.332311 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rgcml" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.339672 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j4tjs"] Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.339811 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j4tjs" podUID="4191dbab-fe34-482b-a895-adef8dc1c4a0" containerName="registry-server" containerID="cri-o://ccd7bdcf29bddffec4f4c1f7aa5191af3fed4976fb44b358b4eb6fc617ce7a09" gracePeriod=30 Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.349596 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rgcml"] Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.489069 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5957e9ea-c2fe-43cb-9318-e22ae96c689c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rgcml\" (UID: \"5957e9ea-c2fe-43cb-9318-e22ae96c689c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rgcml" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.489116 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5957e9ea-c2fe-43cb-9318-e22ae96c689c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rgcml\" (UID: \"5957e9ea-c2fe-43cb-9318-e22ae96c689c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rgcml" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.489154 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjfn8\" (UniqueName: 
\"kubernetes.io/projected/5957e9ea-c2fe-43cb-9318-e22ae96c689c-kube-api-access-bjfn8\") pod \"marketplace-operator-79b997595-rgcml\" (UID: \"5957e9ea-c2fe-43cb-9318-e22ae96c689c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rgcml" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.590549 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5957e9ea-c2fe-43cb-9318-e22ae96c689c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rgcml\" (UID: \"5957e9ea-c2fe-43cb-9318-e22ae96c689c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rgcml" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.590599 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5957e9ea-c2fe-43cb-9318-e22ae96c689c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rgcml\" (UID: \"5957e9ea-c2fe-43cb-9318-e22ae96c689c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rgcml" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.590630 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjfn8\" (UniqueName: \"kubernetes.io/projected/5957e9ea-c2fe-43cb-9318-e22ae96c689c-kube-api-access-bjfn8\") pod \"marketplace-operator-79b997595-rgcml\" (UID: \"5957e9ea-c2fe-43cb-9318-e22ae96c689c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rgcml" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.593056 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5957e9ea-c2fe-43cb-9318-e22ae96c689c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rgcml\" (UID: \"5957e9ea-c2fe-43cb-9318-e22ae96c689c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rgcml" Nov 25 09:08:17 crc 
kubenswrapper[4565]: I1125 09:08:17.597670 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5957e9ea-c2fe-43cb-9318-e22ae96c689c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rgcml\" (UID: \"5957e9ea-c2fe-43cb-9318-e22ae96c689c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rgcml" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.610450 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjfn8\" (UniqueName: \"kubernetes.io/projected/5957e9ea-c2fe-43cb-9318-e22ae96c689c-kube-api-access-bjfn8\") pod \"marketplace-operator-79b997595-rgcml\" (UID: \"5957e9ea-c2fe-43cb-9318-e22ae96c689c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rgcml" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.636360 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-f7mzc" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.733003 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d66wt" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.742737 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s9bxs" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.750519 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j4tjs" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.756773 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hxkk7" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.796101 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8kzl\" (UniqueName: \"kubernetes.io/projected/91f8c44a-7b95-4212-8976-753251e9959b-kube-api-access-l8kzl\") pod \"91f8c44a-7b95-4212-8976-753251e9959b\" (UID: \"91f8c44a-7b95-4212-8976-753251e9959b\") " Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.796160 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/91f8c44a-7b95-4212-8976-753251e9959b-marketplace-trusted-ca\") pod \"91f8c44a-7b95-4212-8976-753251e9959b\" (UID: \"91f8c44a-7b95-4212-8976-753251e9959b\") " Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.796185 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/91f8c44a-7b95-4212-8976-753251e9959b-marketplace-operator-metrics\") pod \"91f8c44a-7b95-4212-8976-753251e9959b\" (UID: \"91f8c44a-7b95-4212-8976-753251e9959b\") " Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.798804 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91f8c44a-7b95-4212-8976-753251e9959b-kube-api-access-l8kzl" (OuterVolumeSpecName: "kube-api-access-l8kzl") pod "91f8c44a-7b95-4212-8976-753251e9959b" (UID: "91f8c44a-7b95-4212-8976-753251e9959b"). InnerVolumeSpecName "kube-api-access-l8kzl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.800642 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91f8c44a-7b95-4212-8976-753251e9959b-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "91f8c44a-7b95-4212-8976-753251e9959b" (UID: "91f8c44a-7b95-4212-8976-753251e9959b"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.810643 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91f8c44a-7b95-4212-8976-753251e9959b-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "91f8c44a-7b95-4212-8976-753251e9959b" (UID: "91f8c44a-7b95-4212-8976-753251e9959b"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.885397 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rgcml" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.897359 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4191dbab-fe34-482b-a895-adef8dc1c4a0-utilities\") pod \"4191dbab-fe34-482b-a895-adef8dc1c4a0\" (UID: \"4191dbab-fe34-482b-a895-adef8dc1c4a0\") " Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.897402 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63ed490a-3e8d-485a-80d9-8c482c9172a1-catalog-content\") pod \"63ed490a-3e8d-485a-80d9-8c482c9172a1\" (UID: \"63ed490a-3e8d-485a-80d9-8c482c9172a1\") " Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.897425 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f41ac7d-f98d-4d50-8346-fc730f41309c-catalog-content\") pod \"3f41ac7d-f98d-4d50-8346-fc730f41309c\" (UID: \"3f41ac7d-f98d-4d50-8346-fc730f41309c\") " Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.897448 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlxjc\" (UniqueName: \"kubernetes.io/projected/5846c394-bccc-4d13-8cea-d3deb171c550-kube-api-access-wlxjc\") pod \"5846c394-bccc-4d13-8cea-d3deb171c550\" (UID: \"5846c394-bccc-4d13-8cea-d3deb171c550\") " Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.897505 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f41ac7d-f98d-4d50-8346-fc730f41309c-utilities\") pod \"3f41ac7d-f98d-4d50-8346-fc730f41309c\" (UID: \"3f41ac7d-f98d-4d50-8346-fc730f41309c\") " Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.897526 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4191dbab-fe34-482b-a895-adef8dc1c4a0-catalog-content\") pod \"4191dbab-fe34-482b-a895-adef8dc1c4a0\" (UID: \"4191dbab-fe34-482b-a895-adef8dc1c4a0\") " Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.897550 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfnwx\" (UniqueName: \"kubernetes.io/projected/4191dbab-fe34-482b-a895-adef8dc1c4a0-kube-api-access-qfnwx\") pod \"4191dbab-fe34-482b-a895-adef8dc1c4a0\" (UID: \"4191dbab-fe34-482b-a895-adef8dc1c4a0\") " Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.897576 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcfpd\" (UniqueName: \"kubernetes.io/projected/63ed490a-3e8d-485a-80d9-8c482c9172a1-kube-api-access-mcfpd\") pod \"63ed490a-3e8d-485a-80d9-8c482c9172a1\" (UID: \"63ed490a-3e8d-485a-80d9-8c482c9172a1\") " Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.897592 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63ed490a-3e8d-485a-80d9-8c482c9172a1-utilities\") pod \"63ed490a-3e8d-485a-80d9-8c482c9172a1\" (UID: \"63ed490a-3e8d-485a-80d9-8c482c9172a1\") " Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.897605 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5846c394-bccc-4d13-8cea-d3deb171c550-catalog-content\") pod \"5846c394-bccc-4d13-8cea-d3deb171c550\" (UID: \"5846c394-bccc-4d13-8cea-d3deb171c550\") " Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.897621 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzmfz\" (UniqueName: \"kubernetes.io/projected/3f41ac7d-f98d-4d50-8346-fc730f41309c-kube-api-access-kzmfz\") pod \"3f41ac7d-f98d-4d50-8346-fc730f41309c\" (UID: 
\"3f41ac7d-f98d-4d50-8346-fc730f41309c\") " Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.897649 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5846c394-bccc-4d13-8cea-d3deb171c550-utilities\") pod \"5846c394-bccc-4d13-8cea-d3deb171c550\" (UID: \"5846c394-bccc-4d13-8cea-d3deb171c550\") " Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.897811 4565 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/91f8c44a-7b95-4212-8976-753251e9959b-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.897821 4565 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/91f8c44a-7b95-4212-8976-753251e9959b-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.897830 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8kzl\" (UniqueName: \"kubernetes.io/projected/91f8c44a-7b95-4212-8976-753251e9959b-kube-api-access-l8kzl\") on node \"crc\" DevicePath \"\"" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.898005 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4191dbab-fe34-482b-a895-adef8dc1c4a0-utilities" (OuterVolumeSpecName: "utilities") pod "4191dbab-fe34-482b-a895-adef8dc1c4a0" (UID: "4191dbab-fe34-482b-a895-adef8dc1c4a0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.898309 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5846c394-bccc-4d13-8cea-d3deb171c550-utilities" (OuterVolumeSpecName: "utilities") pod "5846c394-bccc-4d13-8cea-d3deb171c550" (UID: "5846c394-bccc-4d13-8cea-d3deb171c550"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.901544 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5846c394-bccc-4d13-8cea-d3deb171c550-kube-api-access-wlxjc" (OuterVolumeSpecName: "kube-api-access-wlxjc") pod "5846c394-bccc-4d13-8cea-d3deb171c550" (UID: "5846c394-bccc-4d13-8cea-d3deb171c550"). InnerVolumeSpecName "kube-api-access-wlxjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.901596 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63ed490a-3e8d-485a-80d9-8c482c9172a1-kube-api-access-mcfpd" (OuterVolumeSpecName: "kube-api-access-mcfpd") pod "63ed490a-3e8d-485a-80d9-8c482c9172a1" (UID: "63ed490a-3e8d-485a-80d9-8c482c9172a1"). InnerVolumeSpecName "kube-api-access-mcfpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.902301 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63ed490a-3e8d-485a-80d9-8c482c9172a1-utilities" (OuterVolumeSpecName: "utilities") pod "63ed490a-3e8d-485a-80d9-8c482c9172a1" (UID: "63ed490a-3e8d-485a-80d9-8c482c9172a1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.903200 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f41ac7d-f98d-4d50-8346-fc730f41309c-kube-api-access-kzmfz" (OuterVolumeSpecName: "kube-api-access-kzmfz") pod "3f41ac7d-f98d-4d50-8346-fc730f41309c" (UID: "3f41ac7d-f98d-4d50-8346-fc730f41309c"). InnerVolumeSpecName "kube-api-access-kzmfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.905076 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4191dbab-fe34-482b-a895-adef8dc1c4a0-kube-api-access-qfnwx" (OuterVolumeSpecName: "kube-api-access-qfnwx") pod "4191dbab-fe34-482b-a895-adef8dc1c4a0" (UID: "4191dbab-fe34-482b-a895-adef8dc1c4a0"). InnerVolumeSpecName "kube-api-access-qfnwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.908397 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f41ac7d-f98d-4d50-8346-fc730f41309c-utilities" (OuterVolumeSpecName: "utilities") pod "3f41ac7d-f98d-4d50-8346-fc730f41309c" (UID: "3f41ac7d-f98d-4d50-8346-fc730f41309c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.925634 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5846c394-bccc-4d13-8cea-d3deb171c550-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5846c394-bccc-4d13-8cea-d3deb171c550" (UID: "5846c394-bccc-4d13-8cea-d3deb171c550"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.965572 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f41ac7d-f98d-4d50-8346-fc730f41309c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f41ac7d-f98d-4d50-8346-fc730f41309c" (UID: "3f41ac7d-f98d-4d50-8346-fc730f41309c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:08:17 crc kubenswrapper[4565]: I1125 09:08:17.966873 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63ed490a-3e8d-485a-80d9-8c482c9172a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63ed490a-3e8d-485a-80d9-8c482c9172a1" (UID: "63ed490a-3e8d-485a-80d9-8c482c9172a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.000592 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f41ac7d-f98d-4d50-8346-fc730f41309c-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.000636 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfnwx\" (UniqueName: \"kubernetes.io/projected/4191dbab-fe34-482b-a895-adef8dc1c4a0-kube-api-access-qfnwx\") on node \"crc\" DevicePath \"\"" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.000671 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcfpd\" (UniqueName: \"kubernetes.io/projected/63ed490a-3e8d-485a-80d9-8c482c9172a1-kube-api-access-mcfpd\") on node \"crc\" DevicePath \"\"" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.000679 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63ed490a-3e8d-485a-80d9-8c482c9172a1-utilities\") on 
node \"crc\" DevicePath \"\"" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.000688 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5846c394-bccc-4d13-8cea-d3deb171c550-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.000696 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzmfz\" (UniqueName: \"kubernetes.io/projected/3f41ac7d-f98d-4d50-8346-fc730f41309c-kube-api-access-kzmfz\") on node \"crc\" DevicePath \"\"" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.000704 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5846c394-bccc-4d13-8cea-d3deb171c550-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.000711 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4191dbab-fe34-482b-a895-adef8dc1c4a0-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.000718 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63ed490a-3e8d-485a-80d9-8c482c9172a1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.000725 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f41ac7d-f98d-4d50-8346-fc730f41309c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.000732 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlxjc\" (UniqueName: \"kubernetes.io/projected/5846c394-bccc-4d13-8cea-d3deb171c550-kube-api-access-wlxjc\") on node \"crc\" DevicePath \"\"" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.023099 4565 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4191dbab-fe34-482b-a895-adef8dc1c4a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4191dbab-fe34-482b-a895-adef8dc1c4a0" (UID: "4191dbab-fe34-482b-a895-adef8dc1c4a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.101588 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4191dbab-fe34-482b-a895-adef8dc1c4a0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.226378 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rgcml"] Nov 25 09:08:18 crc kubenswrapper[4565]: W1125 09:08:18.228568 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5957e9ea_c2fe_43cb_9318_e22ae96c689c.slice/crio-335eb1dd5ffc60f47f2c17257a6fa8b93af6409b05fe4bd0f279f818f3bcd787 WatchSource:0}: Error finding container 335eb1dd5ffc60f47f2c17257a6fa8b93af6409b05fe4bd0f279f818f3bcd787: Status 404 returned error can't find the container with id 335eb1dd5ffc60f47f2c17257a6fa8b93af6409b05fe4bd0f279f818f3bcd787 Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.284154 4565 generic.go:334] "Generic (PLEG): container finished" podID="3f41ac7d-f98d-4d50-8346-fc730f41309c" containerID="f0d7c9ebfa8ffcd3860431f79b69809bd2fa44a97bb60cc5a4012f2ff7c9a0af" exitCode=0 Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.284217 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s9bxs" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.284240 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9bxs" event={"ID":"3f41ac7d-f98d-4d50-8346-fc730f41309c","Type":"ContainerDied","Data":"f0d7c9ebfa8ffcd3860431f79b69809bd2fa44a97bb60cc5a4012f2ff7c9a0af"} Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.284274 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9bxs" event={"ID":"3f41ac7d-f98d-4d50-8346-fc730f41309c","Type":"ContainerDied","Data":"bf6e1bcbc093040ed177f728f6baec304019320481496e9735ecc747e4c55d7d"} Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.284291 4565 scope.go:117] "RemoveContainer" containerID="f0d7c9ebfa8ffcd3860431f79b69809bd2fa44a97bb60cc5a4012f2ff7c9a0af" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.286441 4565 generic.go:334] "Generic (PLEG): container finished" podID="63ed490a-3e8d-485a-80d9-8c482c9172a1" containerID="e8fb1d87af8406f1d219900083e170e6928e0463c0a113a28a2323781bb18b06" exitCode=0 Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.286635 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d66wt" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.287108 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d66wt" event={"ID":"63ed490a-3e8d-485a-80d9-8c482c9172a1","Type":"ContainerDied","Data":"e8fb1d87af8406f1d219900083e170e6928e0463c0a113a28a2323781bb18b06"} Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.287148 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d66wt" event={"ID":"63ed490a-3e8d-485a-80d9-8c482c9172a1","Type":"ContainerDied","Data":"65d770105ee6c5fcf476e1da3b7ba8a2185db25ec6649c8952e19c920cf5b288"} Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.289464 4565 generic.go:334] "Generic (PLEG): container finished" podID="4191dbab-fe34-482b-a895-adef8dc1c4a0" containerID="ccd7bdcf29bddffec4f4c1f7aa5191af3fed4976fb44b358b4eb6fc617ce7a09" exitCode=0 Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.289537 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4tjs" event={"ID":"4191dbab-fe34-482b-a895-adef8dc1c4a0","Type":"ContainerDied","Data":"ccd7bdcf29bddffec4f4c1f7aa5191af3fed4976fb44b358b4eb6fc617ce7a09"} Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.289562 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4tjs" event={"ID":"4191dbab-fe34-482b-a895-adef8dc1c4a0","Type":"ContainerDied","Data":"9e5b56fff6e524773fff6688c2556ca4b5e149229b582c4c8e37e68ac6cef825"} Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.289576 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j4tjs" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.291894 4565 generic.go:334] "Generic (PLEG): container finished" podID="5846c394-bccc-4d13-8cea-d3deb171c550" containerID="ab50a96570a551cca614783ff281bc14931bc7482fcf96177803e3beb713b998" exitCode=0 Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.291963 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxkk7" event={"ID":"5846c394-bccc-4d13-8cea-d3deb171c550","Type":"ContainerDied","Data":"ab50a96570a551cca614783ff281bc14931bc7482fcf96177803e3beb713b998"} Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.291968 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hxkk7" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.291983 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxkk7" event={"ID":"5846c394-bccc-4d13-8cea-d3deb171c550","Type":"ContainerDied","Data":"4604a7c85adbaa945b400d5cf6f7de985ee5af30c28f46d45f1f10c8926e9543"} Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.293222 4565 generic.go:334] "Generic (PLEG): container finished" podID="91f8c44a-7b95-4212-8976-753251e9959b" containerID="47d0294b880dc0af156a2348ea192aadd1be95454983e96e4ccb6d7bfbc35844" exitCode=0 Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.293262 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-f7mzc" event={"ID":"91f8c44a-7b95-4212-8976-753251e9959b","Type":"ContainerDied","Data":"47d0294b880dc0af156a2348ea192aadd1be95454983e96e4ccb6d7bfbc35844"} Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.293463 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-f7mzc" 
event={"ID":"91f8c44a-7b95-4212-8976-753251e9959b","Type":"ContainerDied","Data":"5e6c3fb2fa614ef47a443e9b9b79e62027e238b7f9774356c60c8993b07b4982"} Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.293293 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-f7mzc" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.294131 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rgcml" event={"ID":"5957e9ea-c2fe-43cb-9318-e22ae96c689c","Type":"ContainerStarted","Data":"335eb1dd5ffc60f47f2c17257a6fa8b93af6409b05fe4bd0f279f818f3bcd787"} Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.300730 4565 scope.go:117] "RemoveContainer" containerID="2ae5d645bcd23d2f15ce3f0ce98c779b6f5b8f6f72e32884d602e2a8bb503b56" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.314041 4565 scope.go:117] "RemoveContainer" containerID="0355a4807f8e34365e855245b50872a3ca97faca7b8c738a349d1c2cb72324b0" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.324635 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s9bxs"] Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.328548 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s9bxs"] Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.343997 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j4tjs"] Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.349290 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j4tjs"] Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.351670 4565 scope.go:117] "RemoveContainer" containerID="f0d7c9ebfa8ffcd3860431f79b69809bd2fa44a97bb60cc5a4012f2ff7c9a0af" Nov 25 09:08:18 crc kubenswrapper[4565]: E1125 09:08:18.352298 4565 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0d7c9ebfa8ffcd3860431f79b69809bd2fa44a97bb60cc5a4012f2ff7c9a0af\": container with ID starting with f0d7c9ebfa8ffcd3860431f79b69809bd2fa44a97bb60cc5a4012f2ff7c9a0af not found: ID does not exist" containerID="f0d7c9ebfa8ffcd3860431f79b69809bd2fa44a97bb60cc5a4012f2ff7c9a0af" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.352399 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0d7c9ebfa8ffcd3860431f79b69809bd2fa44a97bb60cc5a4012f2ff7c9a0af"} err="failed to get container status \"f0d7c9ebfa8ffcd3860431f79b69809bd2fa44a97bb60cc5a4012f2ff7c9a0af\": rpc error: code = NotFound desc = could not find container \"f0d7c9ebfa8ffcd3860431f79b69809bd2fa44a97bb60cc5a4012f2ff7c9a0af\": container with ID starting with f0d7c9ebfa8ffcd3860431f79b69809bd2fa44a97bb60cc5a4012f2ff7c9a0af not found: ID does not exist" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.352493 4565 scope.go:117] "RemoveContainer" containerID="2ae5d645bcd23d2f15ce3f0ce98c779b6f5b8f6f72e32884d602e2a8bb503b56" Nov 25 09:08:18 crc kubenswrapper[4565]: E1125 09:08:18.358091 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ae5d645bcd23d2f15ce3f0ce98c779b6f5b8f6f72e32884d602e2a8bb503b56\": container with ID starting with 2ae5d645bcd23d2f15ce3f0ce98c779b6f5b8f6f72e32884d602e2a8bb503b56 not found: ID does not exist" containerID="2ae5d645bcd23d2f15ce3f0ce98c779b6f5b8f6f72e32884d602e2a8bb503b56" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.358127 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ae5d645bcd23d2f15ce3f0ce98c779b6f5b8f6f72e32884d602e2a8bb503b56"} err="failed to get container status \"2ae5d645bcd23d2f15ce3f0ce98c779b6f5b8f6f72e32884d602e2a8bb503b56\": rpc error: code = NotFound desc = could 
not find container \"2ae5d645bcd23d2f15ce3f0ce98c779b6f5b8f6f72e32884d602e2a8bb503b56\": container with ID starting with 2ae5d645bcd23d2f15ce3f0ce98c779b6f5b8f6f72e32884d602e2a8bb503b56 not found: ID does not exist" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.358151 4565 scope.go:117] "RemoveContainer" containerID="0355a4807f8e34365e855245b50872a3ca97faca7b8c738a349d1c2cb72324b0" Nov 25 09:08:18 crc kubenswrapper[4565]: E1125 09:08:18.358443 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0355a4807f8e34365e855245b50872a3ca97faca7b8c738a349d1c2cb72324b0\": container with ID starting with 0355a4807f8e34365e855245b50872a3ca97faca7b8c738a349d1c2cb72324b0 not found: ID does not exist" containerID="0355a4807f8e34365e855245b50872a3ca97faca7b8c738a349d1c2cb72324b0" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.358484 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0355a4807f8e34365e855245b50872a3ca97faca7b8c738a349d1c2cb72324b0"} err="failed to get container status \"0355a4807f8e34365e855245b50872a3ca97faca7b8c738a349d1c2cb72324b0\": rpc error: code = NotFound desc = could not find container \"0355a4807f8e34365e855245b50872a3ca97faca7b8c738a349d1c2cb72324b0\": container with ID starting with 0355a4807f8e34365e855245b50872a3ca97faca7b8c738a349d1c2cb72324b0 not found: ID does not exist" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.358505 4565 scope.go:117] "RemoveContainer" containerID="e8fb1d87af8406f1d219900083e170e6928e0463c0a113a28a2323781bb18b06" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.360390 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hxkk7"] Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.364855 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hxkk7"] Nov 25 09:08:18 crc 
kubenswrapper[4565]: I1125 09:08:18.369045 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d66wt"] Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.373139 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d66wt"] Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.378051 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-f7mzc"] Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.379445 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-f7mzc"] Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.379627 4565 scope.go:117] "RemoveContainer" containerID="58ef678de9d92e1d735880bdfe5bfb1de8222126c364cabe07f849b9285793a2" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.395667 4565 scope.go:117] "RemoveContainer" containerID="4e3504a2876dc3b611324bc0de7a73a51b5e180b846042172cfd79f12b87aa28" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.404396 4565 scope.go:117] "RemoveContainer" containerID="e8fb1d87af8406f1d219900083e170e6928e0463c0a113a28a2323781bb18b06" Nov 25 09:08:18 crc kubenswrapper[4565]: E1125 09:08:18.404775 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8fb1d87af8406f1d219900083e170e6928e0463c0a113a28a2323781bb18b06\": container with ID starting with e8fb1d87af8406f1d219900083e170e6928e0463c0a113a28a2323781bb18b06 not found: ID does not exist" containerID="e8fb1d87af8406f1d219900083e170e6928e0463c0a113a28a2323781bb18b06" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.404861 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8fb1d87af8406f1d219900083e170e6928e0463c0a113a28a2323781bb18b06"} err="failed to get container status 
\"e8fb1d87af8406f1d219900083e170e6928e0463c0a113a28a2323781bb18b06\": rpc error: code = NotFound desc = could not find container \"e8fb1d87af8406f1d219900083e170e6928e0463c0a113a28a2323781bb18b06\": container with ID starting with e8fb1d87af8406f1d219900083e170e6928e0463c0a113a28a2323781bb18b06 not found: ID does not exist" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.405040 4565 scope.go:117] "RemoveContainer" containerID="58ef678de9d92e1d735880bdfe5bfb1de8222126c364cabe07f849b9285793a2" Nov 25 09:08:18 crc kubenswrapper[4565]: E1125 09:08:18.405385 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58ef678de9d92e1d735880bdfe5bfb1de8222126c364cabe07f849b9285793a2\": container with ID starting with 58ef678de9d92e1d735880bdfe5bfb1de8222126c364cabe07f849b9285793a2 not found: ID does not exist" containerID="58ef678de9d92e1d735880bdfe5bfb1de8222126c364cabe07f849b9285793a2" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.405417 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ef678de9d92e1d735880bdfe5bfb1de8222126c364cabe07f849b9285793a2"} err="failed to get container status \"58ef678de9d92e1d735880bdfe5bfb1de8222126c364cabe07f849b9285793a2\": rpc error: code = NotFound desc = could not find container \"58ef678de9d92e1d735880bdfe5bfb1de8222126c364cabe07f849b9285793a2\": container with ID starting with 58ef678de9d92e1d735880bdfe5bfb1de8222126c364cabe07f849b9285793a2 not found: ID does not exist" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.405441 4565 scope.go:117] "RemoveContainer" containerID="4e3504a2876dc3b611324bc0de7a73a51b5e180b846042172cfd79f12b87aa28" Nov 25 09:08:18 crc kubenswrapper[4565]: E1125 09:08:18.405665 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4e3504a2876dc3b611324bc0de7a73a51b5e180b846042172cfd79f12b87aa28\": container with ID starting with 4e3504a2876dc3b611324bc0de7a73a51b5e180b846042172cfd79f12b87aa28 not found: ID does not exist" containerID="4e3504a2876dc3b611324bc0de7a73a51b5e180b846042172cfd79f12b87aa28" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.405692 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e3504a2876dc3b611324bc0de7a73a51b5e180b846042172cfd79f12b87aa28"} err="failed to get container status \"4e3504a2876dc3b611324bc0de7a73a51b5e180b846042172cfd79f12b87aa28\": rpc error: code = NotFound desc = could not find container \"4e3504a2876dc3b611324bc0de7a73a51b5e180b846042172cfd79f12b87aa28\": container with ID starting with 4e3504a2876dc3b611324bc0de7a73a51b5e180b846042172cfd79f12b87aa28 not found: ID does not exist" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.405711 4565 scope.go:117] "RemoveContainer" containerID="ccd7bdcf29bddffec4f4c1f7aa5191af3fed4976fb44b358b4eb6fc617ce7a09" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.416361 4565 scope.go:117] "RemoveContainer" containerID="3baf24e5a133ae546c1e7ea3045c27fe13aee43be2fbff81487968a82e433ae4" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.468318 4565 scope.go:117] "RemoveContainer" containerID="db3e75830fe2816876f0ecca7c6317d5a4fef338d38e6e3636454dcf71e5f85f" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.478859 4565 scope.go:117] "RemoveContainer" containerID="ccd7bdcf29bddffec4f4c1f7aa5191af3fed4976fb44b358b4eb6fc617ce7a09" Nov 25 09:08:18 crc kubenswrapper[4565]: E1125 09:08:18.479180 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccd7bdcf29bddffec4f4c1f7aa5191af3fed4976fb44b358b4eb6fc617ce7a09\": container with ID starting with ccd7bdcf29bddffec4f4c1f7aa5191af3fed4976fb44b358b4eb6fc617ce7a09 not found: ID does not exist" 
containerID="ccd7bdcf29bddffec4f4c1f7aa5191af3fed4976fb44b358b4eb6fc617ce7a09" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.479263 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccd7bdcf29bddffec4f4c1f7aa5191af3fed4976fb44b358b4eb6fc617ce7a09"} err="failed to get container status \"ccd7bdcf29bddffec4f4c1f7aa5191af3fed4976fb44b358b4eb6fc617ce7a09\": rpc error: code = NotFound desc = could not find container \"ccd7bdcf29bddffec4f4c1f7aa5191af3fed4976fb44b358b4eb6fc617ce7a09\": container with ID starting with ccd7bdcf29bddffec4f4c1f7aa5191af3fed4976fb44b358b4eb6fc617ce7a09 not found: ID does not exist" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.479334 4565 scope.go:117] "RemoveContainer" containerID="3baf24e5a133ae546c1e7ea3045c27fe13aee43be2fbff81487968a82e433ae4" Nov 25 09:08:18 crc kubenswrapper[4565]: E1125 09:08:18.479593 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3baf24e5a133ae546c1e7ea3045c27fe13aee43be2fbff81487968a82e433ae4\": container with ID starting with 3baf24e5a133ae546c1e7ea3045c27fe13aee43be2fbff81487968a82e433ae4 not found: ID does not exist" containerID="3baf24e5a133ae546c1e7ea3045c27fe13aee43be2fbff81487968a82e433ae4" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.479699 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3baf24e5a133ae546c1e7ea3045c27fe13aee43be2fbff81487968a82e433ae4"} err="failed to get container status \"3baf24e5a133ae546c1e7ea3045c27fe13aee43be2fbff81487968a82e433ae4\": rpc error: code = NotFound desc = could not find container \"3baf24e5a133ae546c1e7ea3045c27fe13aee43be2fbff81487968a82e433ae4\": container with ID starting with 3baf24e5a133ae546c1e7ea3045c27fe13aee43be2fbff81487968a82e433ae4 not found: ID does not exist" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.479769 4565 scope.go:117] 
"RemoveContainer" containerID="db3e75830fe2816876f0ecca7c6317d5a4fef338d38e6e3636454dcf71e5f85f" Nov 25 09:08:18 crc kubenswrapper[4565]: E1125 09:08:18.480022 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db3e75830fe2816876f0ecca7c6317d5a4fef338d38e6e3636454dcf71e5f85f\": container with ID starting with db3e75830fe2816876f0ecca7c6317d5a4fef338d38e6e3636454dcf71e5f85f not found: ID does not exist" containerID="db3e75830fe2816876f0ecca7c6317d5a4fef338d38e6e3636454dcf71e5f85f" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.480104 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db3e75830fe2816876f0ecca7c6317d5a4fef338d38e6e3636454dcf71e5f85f"} err="failed to get container status \"db3e75830fe2816876f0ecca7c6317d5a4fef338d38e6e3636454dcf71e5f85f\": rpc error: code = NotFound desc = could not find container \"db3e75830fe2816876f0ecca7c6317d5a4fef338d38e6e3636454dcf71e5f85f\": container with ID starting with db3e75830fe2816876f0ecca7c6317d5a4fef338d38e6e3636454dcf71e5f85f not found: ID does not exist" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.480166 4565 scope.go:117] "RemoveContainer" containerID="ab50a96570a551cca614783ff281bc14931bc7482fcf96177803e3beb713b998" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.489309 4565 scope.go:117] "RemoveContainer" containerID="acd4e0b42bd342f7f2d116a666181d898211a4cce9a5361c1d4de7ec93845048" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.499680 4565 scope.go:117] "RemoveContainer" containerID="f3c584d70f010838aa77c6345225588c0b05e6035d50cc76333603f9830cea0f" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.510535 4565 scope.go:117] "RemoveContainer" containerID="ab50a96570a551cca614783ff281bc14931bc7482fcf96177803e3beb713b998" Nov 25 09:08:18 crc kubenswrapper[4565]: E1125 09:08:18.510814 4565 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"ab50a96570a551cca614783ff281bc14931bc7482fcf96177803e3beb713b998\": container with ID starting with ab50a96570a551cca614783ff281bc14931bc7482fcf96177803e3beb713b998 not found: ID does not exist" containerID="ab50a96570a551cca614783ff281bc14931bc7482fcf96177803e3beb713b998" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.510844 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab50a96570a551cca614783ff281bc14931bc7482fcf96177803e3beb713b998"} err="failed to get container status \"ab50a96570a551cca614783ff281bc14931bc7482fcf96177803e3beb713b998\": rpc error: code = NotFound desc = could not find container \"ab50a96570a551cca614783ff281bc14931bc7482fcf96177803e3beb713b998\": container with ID starting with ab50a96570a551cca614783ff281bc14931bc7482fcf96177803e3beb713b998 not found: ID does not exist" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.510864 4565 scope.go:117] "RemoveContainer" containerID="acd4e0b42bd342f7f2d116a666181d898211a4cce9a5361c1d4de7ec93845048" Nov 25 09:08:18 crc kubenswrapper[4565]: E1125 09:08:18.511243 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acd4e0b42bd342f7f2d116a666181d898211a4cce9a5361c1d4de7ec93845048\": container with ID starting with acd4e0b42bd342f7f2d116a666181d898211a4cce9a5361c1d4de7ec93845048 not found: ID does not exist" containerID="acd4e0b42bd342f7f2d116a666181d898211a4cce9a5361c1d4de7ec93845048" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.511275 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acd4e0b42bd342f7f2d116a666181d898211a4cce9a5361c1d4de7ec93845048"} err="failed to get container status \"acd4e0b42bd342f7f2d116a666181d898211a4cce9a5361c1d4de7ec93845048\": rpc error: code = NotFound desc = could not find container 
\"acd4e0b42bd342f7f2d116a666181d898211a4cce9a5361c1d4de7ec93845048\": container with ID starting with acd4e0b42bd342f7f2d116a666181d898211a4cce9a5361c1d4de7ec93845048 not found: ID does not exist" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.511296 4565 scope.go:117] "RemoveContainer" containerID="f3c584d70f010838aa77c6345225588c0b05e6035d50cc76333603f9830cea0f" Nov 25 09:08:18 crc kubenswrapper[4565]: E1125 09:08:18.513098 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3c584d70f010838aa77c6345225588c0b05e6035d50cc76333603f9830cea0f\": container with ID starting with f3c584d70f010838aa77c6345225588c0b05e6035d50cc76333603f9830cea0f not found: ID does not exist" containerID="f3c584d70f010838aa77c6345225588c0b05e6035d50cc76333603f9830cea0f" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.513123 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3c584d70f010838aa77c6345225588c0b05e6035d50cc76333603f9830cea0f"} err="failed to get container status \"f3c584d70f010838aa77c6345225588c0b05e6035d50cc76333603f9830cea0f\": rpc error: code = NotFound desc = could not find container \"f3c584d70f010838aa77c6345225588c0b05e6035d50cc76333603f9830cea0f\": container with ID starting with f3c584d70f010838aa77c6345225588c0b05e6035d50cc76333603f9830cea0f not found: ID does not exist" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.513139 4565 scope.go:117] "RemoveContainer" containerID="47d0294b880dc0af156a2348ea192aadd1be95454983e96e4ccb6d7bfbc35844" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.523845 4565 scope.go:117] "RemoveContainer" containerID="47d0294b880dc0af156a2348ea192aadd1be95454983e96e4ccb6d7bfbc35844" Nov 25 09:08:18 crc kubenswrapper[4565]: E1125 09:08:18.524174 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"47d0294b880dc0af156a2348ea192aadd1be95454983e96e4ccb6d7bfbc35844\": container with ID starting with 47d0294b880dc0af156a2348ea192aadd1be95454983e96e4ccb6d7bfbc35844 not found: ID does not exist" containerID="47d0294b880dc0af156a2348ea192aadd1be95454983e96e4ccb6d7bfbc35844" Nov 25 09:08:18 crc kubenswrapper[4565]: I1125 09:08:18.524207 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47d0294b880dc0af156a2348ea192aadd1be95454983e96e4ccb6d7bfbc35844"} err="failed to get container status \"47d0294b880dc0af156a2348ea192aadd1be95454983e96e4ccb6d7bfbc35844\": rpc error: code = NotFound desc = could not find container \"47d0294b880dc0af156a2348ea192aadd1be95454983e96e4ccb6d7bfbc35844\": container with ID starting with 47d0294b880dc0af156a2348ea192aadd1be95454983e96e4ccb6d7bfbc35844 not found: ID does not exist" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.103792 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f41ac7d-f98d-4d50-8346-fc730f41309c" path="/var/lib/kubelet/pods/3f41ac7d-f98d-4d50-8346-fc730f41309c/volumes" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.104349 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4191dbab-fe34-482b-a895-adef8dc1c4a0" path="/var/lib/kubelet/pods/4191dbab-fe34-482b-a895-adef8dc1c4a0/volumes" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.104861 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5846c394-bccc-4d13-8cea-d3deb171c550" path="/var/lib/kubelet/pods/5846c394-bccc-4d13-8cea-d3deb171c550/volumes" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.105409 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63ed490a-3e8d-485a-80d9-8c482c9172a1" path="/var/lib/kubelet/pods/63ed490a-3e8d-485a-80d9-8c482c9172a1/volumes" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.106582 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="91f8c44a-7b95-4212-8976-753251e9959b" path="/var/lib/kubelet/pods/91f8c44a-7b95-4212-8976-753251e9959b/volumes" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.302760 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rgcml" event={"ID":"5957e9ea-c2fe-43cb-9318-e22ae96c689c","Type":"ContainerStarted","Data":"ad29b16ac58d395ef0023e998038476222bb2c3909808a7ba6892f58c1e5f1c1"} Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.318496 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-rgcml" podStartSLOduration=2.318470085 podStartE2EDuration="2.318470085s" podCreationTimestamp="2025-11-25 09:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:08:19.315331206 +0000 UTC m=+232.517826345" watchObservedRunningTime="2025-11-25 09:08:19.318470085 +0000 UTC m=+232.520965223" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.511024 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mnphj"] Nov 25 09:08:19 crc kubenswrapper[4565]: E1125 09:08:19.511227 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5846c394-bccc-4d13-8cea-d3deb171c550" containerName="extract-content" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.511239 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="5846c394-bccc-4d13-8cea-d3deb171c550" containerName="extract-content" Nov 25 09:08:19 crc kubenswrapper[4565]: E1125 09:08:19.511250 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4191dbab-fe34-482b-a895-adef8dc1c4a0" containerName="extract-content" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.511256 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4191dbab-fe34-482b-a895-adef8dc1c4a0" 
containerName="extract-content" Nov 25 09:08:19 crc kubenswrapper[4565]: E1125 09:08:19.511263 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f41ac7d-f98d-4d50-8346-fc730f41309c" containerName="registry-server" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.511268 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f41ac7d-f98d-4d50-8346-fc730f41309c" containerName="registry-server" Nov 25 09:08:19 crc kubenswrapper[4565]: E1125 09:08:19.511274 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5846c394-bccc-4d13-8cea-d3deb171c550" containerName="registry-server" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.511280 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="5846c394-bccc-4d13-8cea-d3deb171c550" containerName="registry-server" Nov 25 09:08:19 crc kubenswrapper[4565]: E1125 09:08:19.511287 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63ed490a-3e8d-485a-80d9-8c482c9172a1" containerName="extract-utilities" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.511293 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="63ed490a-3e8d-485a-80d9-8c482c9172a1" containerName="extract-utilities" Nov 25 09:08:19 crc kubenswrapper[4565]: E1125 09:08:19.511301 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63ed490a-3e8d-485a-80d9-8c482c9172a1" containerName="registry-server" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.511306 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="63ed490a-3e8d-485a-80d9-8c482c9172a1" containerName="registry-server" Nov 25 09:08:19 crc kubenswrapper[4565]: E1125 09:08:19.511314 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f41ac7d-f98d-4d50-8346-fc730f41309c" containerName="extract-utilities" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.511320 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f41ac7d-f98d-4d50-8346-fc730f41309c" 
containerName="extract-utilities" Nov 25 09:08:19 crc kubenswrapper[4565]: E1125 09:08:19.511325 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5846c394-bccc-4d13-8cea-d3deb171c550" containerName="extract-utilities" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.511332 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="5846c394-bccc-4d13-8cea-d3deb171c550" containerName="extract-utilities" Nov 25 09:08:19 crc kubenswrapper[4565]: E1125 09:08:19.511338 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91f8c44a-7b95-4212-8976-753251e9959b" containerName="marketplace-operator" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.511343 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="91f8c44a-7b95-4212-8976-753251e9959b" containerName="marketplace-operator" Nov 25 09:08:19 crc kubenswrapper[4565]: E1125 09:08:19.511350 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4191dbab-fe34-482b-a895-adef8dc1c4a0" containerName="extract-utilities" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.511355 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4191dbab-fe34-482b-a895-adef8dc1c4a0" containerName="extract-utilities" Nov 25 09:08:19 crc kubenswrapper[4565]: E1125 09:08:19.511362 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63ed490a-3e8d-485a-80d9-8c482c9172a1" containerName="extract-content" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.511367 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="63ed490a-3e8d-485a-80d9-8c482c9172a1" containerName="extract-content" Nov 25 09:08:19 crc kubenswrapper[4565]: E1125 09:08:19.511375 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4191dbab-fe34-482b-a895-adef8dc1c4a0" containerName="registry-server" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.511379 4565 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4191dbab-fe34-482b-a895-adef8dc1c4a0" containerName="registry-server" Nov 25 09:08:19 crc kubenswrapper[4565]: E1125 09:08:19.511385 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f41ac7d-f98d-4d50-8346-fc730f41309c" containerName="extract-content" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.511390 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f41ac7d-f98d-4d50-8346-fc730f41309c" containerName="extract-content" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.511467 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f41ac7d-f98d-4d50-8346-fc730f41309c" containerName="registry-server" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.511476 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="63ed490a-3e8d-485a-80d9-8c482c9172a1" containerName="registry-server" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.511504 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="4191dbab-fe34-482b-a895-adef8dc1c4a0" containerName="registry-server" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.511509 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="5846c394-bccc-4d13-8cea-d3deb171c550" containerName="registry-server" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.511516 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="91f8c44a-7b95-4212-8976-753251e9959b" containerName="marketplace-operator" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.512172 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mnphj" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.515395 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.519231 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnphj"] Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.620121 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4mwr\" (UniqueName: \"kubernetes.io/projected/849cf673-c6d9-4372-a23d-96e04c71a796-kube-api-access-q4mwr\") pod \"redhat-marketplace-mnphj\" (UID: \"849cf673-c6d9-4372-a23d-96e04c71a796\") " pod="openshift-marketplace/redhat-marketplace-mnphj" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.620173 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/849cf673-c6d9-4372-a23d-96e04c71a796-catalog-content\") pod \"redhat-marketplace-mnphj\" (UID: \"849cf673-c6d9-4372-a23d-96e04c71a796\") " pod="openshift-marketplace/redhat-marketplace-mnphj" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.620203 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/849cf673-c6d9-4372-a23d-96e04c71a796-utilities\") pod \"redhat-marketplace-mnphj\" (UID: \"849cf673-c6d9-4372-a23d-96e04c71a796\") " pod="openshift-marketplace/redhat-marketplace-mnphj" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.709577 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wrgj8"] Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.710524 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wrgj8" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.712285 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.721330 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4mwr\" (UniqueName: \"kubernetes.io/projected/849cf673-c6d9-4372-a23d-96e04c71a796-kube-api-access-q4mwr\") pod \"redhat-marketplace-mnphj\" (UID: \"849cf673-c6d9-4372-a23d-96e04c71a796\") " pod="openshift-marketplace/redhat-marketplace-mnphj" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.721380 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/849cf673-c6d9-4372-a23d-96e04c71a796-catalog-content\") pod \"redhat-marketplace-mnphj\" (UID: \"849cf673-c6d9-4372-a23d-96e04c71a796\") " pod="openshift-marketplace/redhat-marketplace-mnphj" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.721416 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/849cf673-c6d9-4372-a23d-96e04c71a796-utilities\") pod \"redhat-marketplace-mnphj\" (UID: \"849cf673-c6d9-4372-a23d-96e04c71a796\") " pod="openshift-marketplace/redhat-marketplace-mnphj" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.721795 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/849cf673-c6d9-4372-a23d-96e04c71a796-catalog-content\") pod \"redhat-marketplace-mnphj\" (UID: \"849cf673-c6d9-4372-a23d-96e04c71a796\") " pod="openshift-marketplace/redhat-marketplace-mnphj" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.721947 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/849cf673-c6d9-4372-a23d-96e04c71a796-utilities\") pod \"redhat-marketplace-mnphj\" (UID: \"849cf673-c6d9-4372-a23d-96e04c71a796\") " pod="openshift-marketplace/redhat-marketplace-mnphj" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.724186 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wrgj8"] Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.740114 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4mwr\" (UniqueName: \"kubernetes.io/projected/849cf673-c6d9-4372-a23d-96e04c71a796-kube-api-access-q4mwr\") pod \"redhat-marketplace-mnphj\" (UID: \"849cf673-c6d9-4372-a23d-96e04c71a796\") " pod="openshift-marketplace/redhat-marketplace-mnphj" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.825723 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45e414d4-5edb-48e0-a419-2d00347fda7c-utilities\") pod \"certified-operators-wrgj8\" (UID: \"45e414d4-5edb-48e0-a419-2d00347fda7c\") " pod="openshift-marketplace/certified-operators-wrgj8" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.825759 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45e414d4-5edb-48e0-a419-2d00347fda7c-catalog-content\") pod \"certified-operators-wrgj8\" (UID: \"45e414d4-5edb-48e0-a419-2d00347fda7c\") " pod="openshift-marketplace/certified-operators-wrgj8" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.825792 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8kms\" (UniqueName: \"kubernetes.io/projected/45e414d4-5edb-48e0-a419-2d00347fda7c-kube-api-access-b8kms\") pod \"certified-operators-wrgj8\" (UID: \"45e414d4-5edb-48e0-a419-2d00347fda7c\") " 
pod="openshift-marketplace/certified-operators-wrgj8" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.832771 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mnphj" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.926939 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45e414d4-5edb-48e0-a419-2d00347fda7c-utilities\") pod \"certified-operators-wrgj8\" (UID: \"45e414d4-5edb-48e0-a419-2d00347fda7c\") " pod="openshift-marketplace/certified-operators-wrgj8" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.927000 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45e414d4-5edb-48e0-a419-2d00347fda7c-catalog-content\") pod \"certified-operators-wrgj8\" (UID: \"45e414d4-5edb-48e0-a419-2d00347fda7c\") " pod="openshift-marketplace/certified-operators-wrgj8" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.927053 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8kms\" (UniqueName: \"kubernetes.io/projected/45e414d4-5edb-48e0-a419-2d00347fda7c-kube-api-access-b8kms\") pod \"certified-operators-wrgj8\" (UID: \"45e414d4-5edb-48e0-a419-2d00347fda7c\") " pod="openshift-marketplace/certified-operators-wrgj8" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.927875 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45e414d4-5edb-48e0-a419-2d00347fda7c-utilities\") pod \"certified-operators-wrgj8\" (UID: \"45e414d4-5edb-48e0-a419-2d00347fda7c\") " pod="openshift-marketplace/certified-operators-wrgj8" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.927972 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/45e414d4-5edb-48e0-a419-2d00347fda7c-catalog-content\") pod \"certified-operators-wrgj8\" (UID: \"45e414d4-5edb-48e0-a419-2d00347fda7c\") " pod="openshift-marketplace/certified-operators-wrgj8" Nov 25 09:08:19 crc kubenswrapper[4565]: I1125 09:08:19.939836 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8kms\" (UniqueName: \"kubernetes.io/projected/45e414d4-5edb-48e0-a419-2d00347fda7c-kube-api-access-b8kms\") pod \"certified-operators-wrgj8\" (UID: \"45e414d4-5edb-48e0-a419-2d00347fda7c\") " pod="openshift-marketplace/certified-operators-wrgj8" Nov 25 09:08:20 crc kubenswrapper[4565]: I1125 09:08:20.028557 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wrgj8" Nov 25 09:08:20 crc kubenswrapper[4565]: I1125 09:08:20.167692 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnphj"] Nov 25 09:08:20 crc kubenswrapper[4565]: W1125 09:08:20.172313 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod849cf673_c6d9_4372_a23d_96e04c71a796.slice/crio-64da9684028d1082c5c65f288126a4a80ad40f65dc8a0407c69df21d0d052ca3 WatchSource:0}: Error finding container 64da9684028d1082c5c65f288126a4a80ad40f65dc8a0407c69df21d0d052ca3: Status 404 returned error can't find the container with id 64da9684028d1082c5c65f288126a4a80ad40f65dc8a0407c69df21d0d052ca3 Nov 25 09:08:20 crc kubenswrapper[4565]: I1125 09:08:20.311711 4565 generic.go:334] "Generic (PLEG): container finished" podID="849cf673-c6d9-4372-a23d-96e04c71a796" containerID="f8393021913950bbc04f7685b9a93ec696a108ca3604bb2c1e047c11ff47f023" exitCode=0 Nov 25 09:08:20 crc kubenswrapper[4565]: I1125 09:08:20.311782 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnphj" 
event={"ID":"849cf673-c6d9-4372-a23d-96e04c71a796","Type":"ContainerDied","Data":"f8393021913950bbc04f7685b9a93ec696a108ca3604bb2c1e047c11ff47f023"} Nov 25 09:08:20 crc kubenswrapper[4565]: I1125 09:08:20.311875 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnphj" event={"ID":"849cf673-c6d9-4372-a23d-96e04c71a796","Type":"ContainerStarted","Data":"64da9684028d1082c5c65f288126a4a80ad40f65dc8a0407c69df21d0d052ca3"} Nov 25 09:08:20 crc kubenswrapper[4565]: I1125 09:08:20.311959 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rgcml" Nov 25 09:08:20 crc kubenswrapper[4565]: I1125 09:08:20.313577 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rgcml" Nov 25 09:08:20 crc kubenswrapper[4565]: I1125 09:08:20.354708 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wrgj8"] Nov 25 09:08:20 crc kubenswrapper[4565]: W1125 09:08:20.363767 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45e414d4_5edb_48e0_a419_2d00347fda7c.slice/crio-f312313c9a09c744e641549996f547219c5e4a1bea410a9a584411151f6ef06b WatchSource:0}: Error finding container f312313c9a09c744e641549996f547219c5e4a1bea410a9a584411151f6ef06b: Status 404 returned error can't find the container with id f312313c9a09c744e641549996f547219c5e4a1bea410a9a584411151f6ef06b Nov 25 09:08:21 crc kubenswrapper[4565]: I1125 09:08:21.316556 4565 generic.go:334] "Generic (PLEG): container finished" podID="45e414d4-5edb-48e0-a419-2d00347fda7c" containerID="16f816132d0e82245e9d4c87d8515df47956b898bcb8dbfb1f5374a38717f270" exitCode=0 Nov 25 09:08:21 crc kubenswrapper[4565]: I1125 09:08:21.316647 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-wrgj8" event={"ID":"45e414d4-5edb-48e0-a419-2d00347fda7c","Type":"ContainerDied","Data":"16f816132d0e82245e9d4c87d8515df47956b898bcb8dbfb1f5374a38717f270"}
Nov 25 09:08:21 crc kubenswrapper[4565]: I1125 09:08:21.316807 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wrgj8" event={"ID":"45e414d4-5edb-48e0-a419-2d00347fda7c","Type":"ContainerStarted","Data":"f312313c9a09c744e641549996f547219c5e4a1bea410a9a584411151f6ef06b"}
Nov 25 09:08:21 crc kubenswrapper[4565]: I1125 09:08:21.318862 4565 generic.go:334] "Generic (PLEG): container finished" podID="849cf673-c6d9-4372-a23d-96e04c71a796" containerID="523f6ac4b262aa880f9a594215aedb6e81aa81141e7d2e9b17f5d89b1ee28bb4" exitCode=0
Nov 25 09:08:21 crc kubenswrapper[4565]: I1125 09:08:21.318900 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnphj" event={"ID":"849cf673-c6d9-4372-a23d-96e04c71a796","Type":"ContainerDied","Data":"523f6ac4b262aa880f9a594215aedb6e81aa81141e7d2e9b17f5d89b1ee28bb4"}
Nov 25 09:08:21 crc kubenswrapper[4565]: I1125 09:08:21.916775 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pwhqj"]
Nov 25 09:08:21 crc kubenswrapper[4565]: I1125 09:08:21.917776 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pwhqj"
Nov 25 09:08:21 crc kubenswrapper[4565]: I1125 09:08:21.919088 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Nov 25 09:08:21 crc kubenswrapper[4565]: I1125 09:08:21.925936 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pwhqj"]
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.054822 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23308127-bd19-4001-80f1-9cc93c692984-utilities\") pod \"redhat-operators-pwhqj\" (UID: \"23308127-bd19-4001-80f1-9cc93c692984\") " pod="openshift-marketplace/redhat-operators-pwhqj"
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.054871 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjwxf\" (UniqueName: \"kubernetes.io/projected/23308127-bd19-4001-80f1-9cc93c692984-kube-api-access-mjwxf\") pod \"redhat-operators-pwhqj\" (UID: \"23308127-bd19-4001-80f1-9cc93c692984\") " pod="openshift-marketplace/redhat-operators-pwhqj"
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.055265 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23308127-bd19-4001-80f1-9cc93c692984-catalog-content\") pod \"redhat-operators-pwhqj\" (UID: \"23308127-bd19-4001-80f1-9cc93c692984\") " pod="openshift-marketplace/redhat-operators-pwhqj"
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.112238 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dq959"]
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.113062 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dq959"
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.114816 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.122340 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dq959"]
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.159137 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23308127-bd19-4001-80f1-9cc93c692984-utilities\") pod \"redhat-operators-pwhqj\" (UID: \"23308127-bd19-4001-80f1-9cc93c692984\") " pod="openshift-marketplace/redhat-operators-pwhqj"
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.159182 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjwxf\" (UniqueName: \"kubernetes.io/projected/23308127-bd19-4001-80f1-9cc93c692984-kube-api-access-mjwxf\") pod \"redhat-operators-pwhqj\" (UID: \"23308127-bd19-4001-80f1-9cc93c692984\") " pod="openshift-marketplace/redhat-operators-pwhqj"
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.159202 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23308127-bd19-4001-80f1-9cc93c692984-catalog-content\") pod \"redhat-operators-pwhqj\" (UID: \"23308127-bd19-4001-80f1-9cc93c692984\") " pod="openshift-marketplace/redhat-operators-pwhqj"
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.159593 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23308127-bd19-4001-80f1-9cc93c692984-utilities\") pod \"redhat-operators-pwhqj\" (UID: \"23308127-bd19-4001-80f1-9cc93c692984\") " pod="openshift-marketplace/redhat-operators-pwhqj"
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.159601 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23308127-bd19-4001-80f1-9cc93c692984-catalog-content\") pod \"redhat-operators-pwhqj\" (UID: \"23308127-bd19-4001-80f1-9cc93c692984\") " pod="openshift-marketplace/redhat-operators-pwhqj"
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.174192 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjwxf\" (UniqueName: \"kubernetes.io/projected/23308127-bd19-4001-80f1-9cc93c692984-kube-api-access-mjwxf\") pod \"redhat-operators-pwhqj\" (UID: \"23308127-bd19-4001-80f1-9cc93c692984\") " pod="openshift-marketplace/redhat-operators-pwhqj"
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.229709 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pwhqj"
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.260071 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7v9b\" (UniqueName: \"kubernetes.io/projected/37044fab-4a08-4d4f-a2d4-9da1a0eb5127-kube-api-access-n7v9b\") pod \"community-operators-dq959\" (UID: \"37044fab-4a08-4d4f-a2d4-9da1a0eb5127\") " pod="openshift-marketplace/community-operators-dq959"
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.260127 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37044fab-4a08-4d4f-a2d4-9da1a0eb5127-utilities\") pod \"community-operators-dq959\" (UID: \"37044fab-4a08-4d4f-a2d4-9da1a0eb5127\") " pod="openshift-marketplace/community-operators-dq959"
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.260146 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37044fab-4a08-4d4f-a2d4-9da1a0eb5127-catalog-content\") pod \"community-operators-dq959\" (UID: \"37044fab-4a08-4d4f-a2d4-9da1a0eb5127\") " pod="openshift-marketplace/community-operators-dq959"
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.329377 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnphj" event={"ID":"849cf673-c6d9-4372-a23d-96e04c71a796","Type":"ContainerStarted","Data":"5673a6261e352ed49e875a0a2fb9a63634642380c317135995dce84bac4bf8d4"}
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.332525 4565 generic.go:334] "Generic (PLEG): container finished" podID="45e414d4-5edb-48e0-a419-2d00347fda7c" containerID="625c73992cf5d5d24931cc8611dcef12f14ffaf24296835f99d53b8f5cc5d89c" exitCode=0
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.333002 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wrgj8" event={"ID":"45e414d4-5edb-48e0-a419-2d00347fda7c","Type":"ContainerDied","Data":"625c73992cf5d5d24931cc8611dcef12f14ffaf24296835f99d53b8f5cc5d89c"}
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.357486 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mnphj" podStartSLOduration=1.89191266 podStartE2EDuration="3.357471666s" podCreationTimestamp="2025-11-25 09:08:19 +0000 UTC" firstStartedPulling="2025-11-25 09:08:20.31509296 +0000 UTC m=+233.517588097" lastFinishedPulling="2025-11-25 09:08:21.780651965 +0000 UTC m=+234.983147103" observedRunningTime="2025-11-25 09:08:22.345686836 +0000 UTC m=+235.548181965" watchObservedRunningTime="2025-11-25 09:08:22.357471666 +0000 UTC m=+235.559966805"
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.363428 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37044fab-4a08-4d4f-a2d4-9da1a0eb5127-catalog-content\") pod \"community-operators-dq959\" (UID: \"37044fab-4a08-4d4f-a2d4-9da1a0eb5127\") " pod="openshift-marketplace/community-operators-dq959"
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.363454 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37044fab-4a08-4d4f-a2d4-9da1a0eb5127-utilities\") pod \"community-operators-dq959\" (UID: \"37044fab-4a08-4d4f-a2d4-9da1a0eb5127\") " pod="openshift-marketplace/community-operators-dq959"
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.363524 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7v9b\" (UniqueName: \"kubernetes.io/projected/37044fab-4a08-4d4f-a2d4-9da1a0eb5127-kube-api-access-n7v9b\") pod \"community-operators-dq959\" (UID: \"37044fab-4a08-4d4f-a2d4-9da1a0eb5127\") " pod="openshift-marketplace/community-operators-dq959"
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.363876 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37044fab-4a08-4d4f-a2d4-9da1a0eb5127-catalog-content\") pod \"community-operators-dq959\" (UID: \"37044fab-4a08-4d4f-a2d4-9da1a0eb5127\") " pod="openshift-marketplace/community-operators-dq959"
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.363919 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37044fab-4a08-4d4f-a2d4-9da1a0eb5127-utilities\") pod \"community-operators-dq959\" (UID: \"37044fab-4a08-4d4f-a2d4-9da1a0eb5127\") " pod="openshift-marketplace/community-operators-dq959"
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.377972 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7v9b\" (UniqueName: \"kubernetes.io/projected/37044fab-4a08-4d4f-a2d4-9da1a0eb5127-kube-api-access-n7v9b\") pod \"community-operators-dq959\" (UID: \"37044fab-4a08-4d4f-a2d4-9da1a0eb5127\") " pod="openshift-marketplace/community-operators-dq959"
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.424987 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dq959"
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.572713 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pwhqj"]
Nov 25 09:08:22 crc kubenswrapper[4565]: W1125 09:08:22.581159 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23308127_bd19_4001_80f1_9cc93c692984.slice/crio-33f3ea91e0bdc3e647b95764b7e68ee93dd5bce993a71ab9434b5765fd0d118c WatchSource:0}: Error finding container 33f3ea91e0bdc3e647b95764b7e68ee93dd5bce993a71ab9434b5765fd0d118c: Status 404 returned error can't find the container with id 33f3ea91e0bdc3e647b95764b7e68ee93dd5bce993a71ab9434b5765fd0d118c
Nov 25 09:08:22 crc kubenswrapper[4565]: I1125 09:08:22.763823 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dq959"]
Nov 25 09:08:22 crc kubenswrapper[4565]: W1125 09:08:22.768758 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37044fab_4a08_4d4f_a2d4_9da1a0eb5127.slice/crio-2fbc29788aa6af4cfbd90059053087d54b7cd75109179e39bf1b4bb128363c21 WatchSource:0}: Error finding container 2fbc29788aa6af4cfbd90059053087d54b7cd75109179e39bf1b4bb128363c21: Status 404 returned error can't find the container with id 2fbc29788aa6af4cfbd90059053087d54b7cd75109179e39bf1b4bb128363c21
Nov 25 09:08:23 crc kubenswrapper[4565]: I1125 09:08:23.336966 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wrgj8" event={"ID":"45e414d4-5edb-48e0-a419-2d00347fda7c","Type":"ContainerStarted","Data":"dd1774b902805101f33c00e1a61040d6e37c5edbc9b0de87ac832b9407d18953"}
Nov 25 09:08:23 crc kubenswrapper[4565]: I1125 09:08:23.339909 4565 generic.go:334] "Generic (PLEG): container finished" podID="37044fab-4a08-4d4f-a2d4-9da1a0eb5127" containerID="0ea54ac0806df117a2440de042d2c5659175408ca7d67bb8b2ecce6f8692656d" exitCode=0
Nov 25 09:08:23 crc kubenswrapper[4565]: I1125 09:08:23.339980 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dq959" event={"ID":"37044fab-4a08-4d4f-a2d4-9da1a0eb5127","Type":"ContainerDied","Data":"0ea54ac0806df117a2440de042d2c5659175408ca7d67bb8b2ecce6f8692656d"}
Nov 25 09:08:23 crc kubenswrapper[4565]: I1125 09:08:23.340004 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dq959" event={"ID":"37044fab-4a08-4d4f-a2d4-9da1a0eb5127","Type":"ContainerStarted","Data":"2fbc29788aa6af4cfbd90059053087d54b7cd75109179e39bf1b4bb128363c21"}
Nov 25 09:08:23 crc kubenswrapper[4565]: I1125 09:08:23.347623 4565 generic.go:334] "Generic (PLEG): container finished" podID="23308127-bd19-4001-80f1-9cc93c692984" containerID="eccd5daf9190a6daa9e1e0cba81af834f41f686a39c2d89ddd258b0a0bb1a49b" exitCode=0
Nov 25 09:08:23 crc kubenswrapper[4565]: I1125 09:08:23.347678 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwhqj" event={"ID":"23308127-bd19-4001-80f1-9cc93c692984","Type":"ContainerDied","Data":"eccd5daf9190a6daa9e1e0cba81af834f41f686a39c2d89ddd258b0a0bb1a49b"}
Nov 25 09:08:23 crc kubenswrapper[4565]: I1125 09:08:23.347728 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwhqj" event={"ID":"23308127-bd19-4001-80f1-9cc93c692984","Type":"ContainerStarted","Data":"33f3ea91e0bdc3e647b95764b7e68ee93dd5bce993a71ab9434b5765fd0d118c"}
Nov 25 09:08:23 crc kubenswrapper[4565]: I1125 09:08:23.354529 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wrgj8" podStartSLOduration=2.806991136 podStartE2EDuration="4.354518441s" podCreationTimestamp="2025-11-25 09:08:19 +0000 UTC" firstStartedPulling="2025-11-25 09:08:21.317841746 +0000 UTC m=+234.520336884" lastFinishedPulling="2025-11-25 09:08:22.865369051 +0000 UTC m=+236.067864189" observedRunningTime="2025-11-25 09:08:23.3518287 +0000 UTC m=+236.554323839" watchObservedRunningTime="2025-11-25 09:08:23.354518441 +0000 UTC m=+236.557013579"
Nov 25 09:08:24 crc kubenswrapper[4565]: I1125 09:08:24.352777 4565 generic.go:334] "Generic (PLEG): container finished" podID="37044fab-4a08-4d4f-a2d4-9da1a0eb5127" containerID="fe6c7df9736d20073bf8c41a39a90273d98d69003b3b3ebc2bb08870c55f5e5b" exitCode=0
Nov 25 09:08:24 crc kubenswrapper[4565]: I1125 09:08:24.353010 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dq959" event={"ID":"37044fab-4a08-4d4f-a2d4-9da1a0eb5127","Type":"ContainerDied","Data":"fe6c7df9736d20073bf8c41a39a90273d98d69003b3b3ebc2bb08870c55f5e5b"}
Nov 25 09:08:25 crc kubenswrapper[4565]: I1125 09:08:25.360247 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwhqj" event={"ID":"23308127-bd19-4001-80f1-9cc93c692984","Type":"ContainerStarted","Data":"160dd3fb4931fdfb33a8aa35c0c60ff1482badb189ec8f09cf517eb328a8f05e"}
Nov 25 09:08:25 crc kubenswrapper[4565]: I1125 09:08:25.363089 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dq959" event={"ID":"37044fab-4a08-4d4f-a2d4-9da1a0eb5127","Type":"ContainerStarted","Data":"10594da4441cbca503db3efc65fbcd0ad18b09e30c0860588136413c2e7f7949"}
Nov 25 09:08:25 crc kubenswrapper[4565]: I1125 09:08:25.390498 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dq959" podStartSLOduration=1.876896267 podStartE2EDuration="3.390480958s" podCreationTimestamp="2025-11-25 09:08:22 +0000 UTC" firstStartedPulling="2025-11-25 09:08:23.341407245 +0000 UTC m=+236.543902384" lastFinishedPulling="2025-11-25 09:08:24.854991936 +0000 UTC m=+238.057487075" observedRunningTime="2025-11-25 09:08:25.389683472 +0000 UTC m=+238.592178610" watchObservedRunningTime="2025-11-25 09:08:25.390480958 +0000 UTC m=+238.592976097"
Nov 25 09:08:26 crc kubenswrapper[4565]: I1125 09:08:26.368955 4565 generic.go:334] "Generic (PLEG): container finished" podID="23308127-bd19-4001-80f1-9cc93c692984" containerID="160dd3fb4931fdfb33a8aa35c0c60ff1482badb189ec8f09cf517eb328a8f05e" exitCode=0
Nov 25 09:08:26 crc kubenswrapper[4565]: I1125 09:08:26.369535 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwhqj" event={"ID":"23308127-bd19-4001-80f1-9cc93c692984","Type":"ContainerDied","Data":"160dd3fb4931fdfb33a8aa35c0c60ff1482badb189ec8f09cf517eb328a8f05e"}
Nov 25 09:08:27 crc kubenswrapper[4565]: I1125 09:08:27.374216 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwhqj" event={"ID":"23308127-bd19-4001-80f1-9cc93c692984","Type":"ContainerStarted","Data":"70c7575675258677423fc56123bdf3c42d684b55b6efda69f2a8270cc253b0b8"}
Nov 25 09:08:27 crc kubenswrapper[4565]: I1125 09:08:27.387867 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pwhqj" podStartSLOduration=2.8668110479999998 podStartE2EDuration="6.387851058s" podCreationTimestamp="2025-11-25 09:08:21 +0000 UTC" firstStartedPulling="2025-11-25 09:08:23.348760926 +0000 UTC m=+236.551256065" lastFinishedPulling="2025-11-25 09:08:26.869800936 +0000 UTC m=+240.072296075" observedRunningTime="2025-11-25 09:08:27.385336979 +0000 UTC m=+240.587832117" watchObservedRunningTime="2025-11-25 09:08:27.387851058 +0000 UTC m=+240.590346196"
Nov 25 09:08:29 crc kubenswrapper[4565]: I1125 09:08:29.833974 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mnphj"
Nov 25 09:08:29 crc kubenswrapper[4565]: I1125 09:08:29.834135 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mnphj"
Nov 25 09:08:29 crc kubenswrapper[4565]: I1125 09:08:29.864668 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mnphj"
Nov 25 09:08:30 crc kubenswrapper[4565]: I1125 09:08:30.028788 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wrgj8"
Nov 25 09:08:30 crc kubenswrapper[4565]: I1125 09:08:30.028824 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wrgj8"
Nov 25 09:08:30 crc kubenswrapper[4565]: I1125 09:08:30.054336 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wrgj8"
Nov 25 09:08:30 crc kubenswrapper[4565]: I1125 09:08:30.407484 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mnphj"
Nov 25 09:08:30 crc kubenswrapper[4565]: I1125 09:08:30.417270 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wrgj8"
Nov 25 09:08:32 crc kubenswrapper[4565]: I1125 09:08:32.230594 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pwhqj"
Nov 25 09:08:32 crc kubenswrapper[4565]: I1125 09:08:32.230636 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pwhqj"
Nov 25 09:08:32 crc kubenswrapper[4565]: I1125 09:08:32.258923 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pwhqj"
Nov 25 09:08:32 crc kubenswrapper[4565]: I1125 09:08:32.414720 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pwhqj"
Nov 25 09:08:32 crc kubenswrapper[4565]: I1125 09:08:32.425771 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dq959"
Nov 25 09:08:32 crc kubenswrapper[4565]: I1125 09:08:32.426154 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dq959"
Nov 25 09:08:32 crc kubenswrapper[4565]: I1125 09:08:32.453514 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dq959"
Nov 25 09:08:33 crc kubenswrapper[4565]: I1125 09:08:33.428356 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dq959"
Nov 25 09:09:55 crc kubenswrapper[4565]: I1125 09:09:55.099857 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 09:09:55 crc kubenswrapper[4565]: I1125 09:09:55.100548 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 09:10:25 crc kubenswrapper[4565]: I1125 09:10:25.099004 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 09:10:25 crc kubenswrapper[4565]: I1125 09:10:25.099361 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 09:10:55 crc kubenswrapper[4565]: I1125 09:10:55.099048 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 09:10:55 crc kubenswrapper[4565]: I1125 09:10:55.099400 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 09:10:55 crc kubenswrapper[4565]: I1125 09:10:55.101309 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r28bt"
Nov 25 09:10:55 crc kubenswrapper[4565]: I1125 09:10:55.101675 4565 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf382e7b4350d947c6ba89dc567329878d4281ac0e389a463a63fd9a1cf7db93"} pod="openshift-machine-config-operator/machine-config-daemon-r28bt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 25 09:10:55 crc kubenswrapper[4565]: I1125 09:10:55.101726 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" containerID="cri-o://cf382e7b4350d947c6ba89dc567329878d4281ac0e389a463a63fd9a1cf7db93" gracePeriod=600
Nov 25 09:10:55 crc kubenswrapper[4565]: I1125 09:10:55.879418 4565 generic.go:334] "Generic (PLEG): container finished" podID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerID="cf382e7b4350d947c6ba89dc567329878d4281ac0e389a463a63fd9a1cf7db93" exitCode=0
Nov 25 09:10:55 crc kubenswrapper[4565]: I1125 09:10:55.879507 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerDied","Data":"cf382e7b4350d947c6ba89dc567329878d4281ac0e389a463a63fd9a1cf7db93"}
Nov 25 09:10:55 crc kubenswrapper[4565]: I1125 09:10:55.879729 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerStarted","Data":"c83da26a41463f944ec153ed3943109e4f0b3cdfd67ffe37055c08c437f4c00f"}
Nov 25 09:10:55 crc kubenswrapper[4565]: I1125 09:10:55.879749 4565 scope.go:117] "RemoveContainer" containerID="b0f35d7105f4f7ed4b023a99ac5b6878e1c205402a2133c7131e341db10af708"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.195960 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-k4cpm"]
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.196834 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.212809 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-k4cpm"]
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.385877 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1e19d23a-1327-4497-bb3d-93fb4a32fd9f-registry-tls\") pod \"image-registry-66df7c8f76-k4cpm\" (UID: \"1e19d23a-1327-4497-bb3d-93fb4a32fd9f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.386099 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1e19d23a-1327-4497-bb3d-93fb4a32fd9f-registry-certificates\") pod \"image-registry-66df7c8f76-k4cpm\" (UID: \"1e19d23a-1327-4497-bb3d-93fb4a32fd9f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.386204 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1e19d23a-1327-4497-bb3d-93fb4a32fd9f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-k4cpm\" (UID: \"1e19d23a-1327-4497-bb3d-93fb4a32fd9f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.386303 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-k4cpm\" (UID: \"1e19d23a-1327-4497-bb3d-93fb4a32fd9f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.386351 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8452\" (UniqueName: \"kubernetes.io/projected/1e19d23a-1327-4497-bb3d-93fb4a32fd9f-kube-api-access-t8452\") pod \"image-registry-66df7c8f76-k4cpm\" (UID: \"1e19d23a-1327-4497-bb3d-93fb4a32fd9f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.386398 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e19d23a-1327-4497-bb3d-93fb4a32fd9f-trusted-ca\") pod \"image-registry-66df7c8f76-k4cpm\" (UID: \"1e19d23a-1327-4497-bb3d-93fb4a32fd9f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.386435 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1e19d23a-1327-4497-bb3d-93fb4a32fd9f-bound-sa-token\") pod \"image-registry-66df7c8f76-k4cpm\" (UID: \"1e19d23a-1327-4497-bb3d-93fb4a32fd9f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.386459 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1e19d23a-1327-4497-bb3d-93fb4a32fd9f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-k4cpm\" (UID: \"1e19d23a-1327-4497-bb3d-93fb4a32fd9f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.404175 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-k4cpm\" (UID: \"1e19d23a-1327-4497-bb3d-93fb4a32fd9f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.487118 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8452\" (UniqueName: \"kubernetes.io/projected/1e19d23a-1327-4497-bb3d-93fb4a32fd9f-kube-api-access-t8452\") pod \"image-registry-66df7c8f76-k4cpm\" (UID: \"1e19d23a-1327-4497-bb3d-93fb4a32fd9f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.487173 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e19d23a-1327-4497-bb3d-93fb4a32fd9f-trusted-ca\") pod \"image-registry-66df7c8f76-k4cpm\" (UID: \"1e19d23a-1327-4497-bb3d-93fb4a32fd9f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.487195 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1e19d23a-1327-4497-bb3d-93fb4a32fd9f-bound-sa-token\") pod \"image-registry-66df7c8f76-k4cpm\" (UID: \"1e19d23a-1327-4497-bb3d-93fb4a32fd9f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.487217 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1e19d23a-1327-4497-bb3d-93fb4a32fd9f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-k4cpm\" (UID: \"1e19d23a-1327-4497-bb3d-93fb4a32fd9f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.487234 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1e19d23a-1327-4497-bb3d-93fb4a32fd9f-registry-tls\") pod \"image-registry-66df7c8f76-k4cpm\" (UID: \"1e19d23a-1327-4497-bb3d-93fb4a32fd9f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.487252 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1e19d23a-1327-4497-bb3d-93fb4a32fd9f-registry-certificates\") pod \"image-registry-66df7c8f76-k4cpm\" (UID: \"1e19d23a-1327-4497-bb3d-93fb4a32fd9f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.487279 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1e19d23a-1327-4497-bb3d-93fb4a32fd9f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-k4cpm\" (UID: \"1e19d23a-1327-4497-bb3d-93fb4a32fd9f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.487735 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1e19d23a-1327-4497-bb3d-93fb4a32fd9f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-k4cpm\" (UID: \"1e19d23a-1327-4497-bb3d-93fb4a32fd9f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.488264 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e19d23a-1327-4497-bb3d-93fb4a32fd9f-trusted-ca\") pod \"image-registry-66df7c8f76-k4cpm\" (UID: \"1e19d23a-1327-4497-bb3d-93fb4a32fd9f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.488508 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1e19d23a-1327-4497-bb3d-93fb4a32fd9f-registry-certificates\") pod \"image-registry-66df7c8f76-k4cpm\" (UID: \"1e19d23a-1327-4497-bb3d-93fb4a32fd9f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.491464 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1e19d23a-1327-4497-bb3d-93fb4a32fd9f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-k4cpm\" (UID: \"1e19d23a-1327-4497-bb3d-93fb4a32fd9f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.491820 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1e19d23a-1327-4497-bb3d-93fb4a32fd9f-registry-tls\") pod \"image-registry-66df7c8f76-k4cpm\" (UID: \"1e19d23a-1327-4497-bb3d-93fb4a32fd9f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.499995 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1e19d23a-1327-4497-bb3d-93fb4a32fd9f-bound-sa-token\") pod \"image-registry-66df7c8f76-k4cpm\" (UID: \"1e19d23a-1327-4497-bb3d-93fb4a32fd9f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.500620 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8452\" (UniqueName: \"kubernetes.io/projected/1e19d23a-1327-4497-bb3d-93fb4a32fd9f-kube-api-access-t8452\") pod \"image-registry-66df7c8f76-k4cpm\" (UID: \"1e19d23a-1327-4497-bb3d-93fb4a32fd9f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.508574 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:33 crc kubenswrapper[4565]: I1125 09:11:33.830775 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-k4cpm"]
Nov 25 09:11:34 crc kubenswrapper[4565]: I1125 09:11:34.024769 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm" event={"ID":"1e19d23a-1327-4497-bb3d-93fb4a32fd9f","Type":"ContainerStarted","Data":"8c70732b943f4e4a2616e4cda914490b036221c0ef8869d73deb63c70c1c7275"}
Nov 25 09:11:34 crc kubenswrapper[4565]: I1125 09:11:34.024810 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm" event={"ID":"1e19d23a-1327-4497-bb3d-93fb4a32fd9f","Type":"ContainerStarted","Data":"c4e4f7b9407397d8f2240589890207dcce771f10dc3dfa7fa9a3982851597d54"}
Nov 25 09:11:34 crc kubenswrapper[4565]: I1125 09:11:34.025051 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:34 crc kubenswrapper[4565]: I1125 09:11:34.038821 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm" podStartSLOduration=1.038797733 podStartE2EDuration="1.038797733s" podCreationTimestamp="2025-11-25 09:11:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:11:34.0365684 +0000 UTC m=+427.239063537" watchObservedRunningTime="2025-11-25 09:11:34.038797733 +0000 UTC m=+427.241292872"
Nov 25 09:11:53 crc kubenswrapper[4565]: I1125 09:11:53.512499 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-k4cpm"
Nov 25 09:11:53 crc kubenswrapper[4565]: I1125 09:11:53.549054 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fljns"]
Nov 25 09:12:18 crc kubenswrapper[4565]: I1125 09:12:18.575853 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-fljns" podUID="755dab00-cc07-483e-82b6-8a3e54e6dee3" containerName="registry" containerID="cri-o://41949da05fd9037742caa54c029e05ed27287cd560a5bb3305ac051698db9243" gracePeriod=30
Nov 25 09:12:18 crc kubenswrapper[4565]: I1125 09:12:18.830515 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-fljns"
Nov 25 09:12:18 crc kubenswrapper[4565]: I1125 09:12:18.939254 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/755dab00-cc07-483e-82b6-8a3e54e6dee3-ca-trust-extracted\") pod \"755dab00-cc07-483e-82b6-8a3e54e6dee3\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") "
Nov 25 09:12:18 crc kubenswrapper[4565]: I1125 09:12:18.939304 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/755dab00-cc07-483e-82b6-8a3e54e6dee3-registry-certificates\") pod \"755dab00-cc07-483e-82b6-8a3e54e6dee3\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") "
Nov 25 09:12:18 crc kubenswrapper[4565]: I1125 09:12:18.939345 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nnsx\" (UniqueName: \"kubernetes.io/projected/755dab00-cc07-483e-82b6-8a3e54e6dee3-kube-api-access-8nnsx\") pod \"755dab00-cc07-483e-82b6-8a3e54e6dee3\" (UID:
\"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " Nov 25 09:12:18 crc kubenswrapper[4565]: I1125 09:12:18.939491 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"755dab00-cc07-483e-82b6-8a3e54e6dee3\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " Nov 25 09:12:18 crc kubenswrapper[4565]: I1125 09:12:18.939526 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/755dab00-cc07-483e-82b6-8a3e54e6dee3-registry-tls\") pod \"755dab00-cc07-483e-82b6-8a3e54e6dee3\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " Nov 25 09:12:18 crc kubenswrapper[4565]: I1125 09:12:18.939583 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/755dab00-cc07-483e-82b6-8a3e54e6dee3-trusted-ca\") pod \"755dab00-cc07-483e-82b6-8a3e54e6dee3\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " Nov 25 09:12:18 crc kubenswrapper[4565]: I1125 09:12:18.939605 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/755dab00-cc07-483e-82b6-8a3e54e6dee3-bound-sa-token\") pod \"755dab00-cc07-483e-82b6-8a3e54e6dee3\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " Nov 25 09:12:18 crc kubenswrapper[4565]: I1125 09:12:18.939622 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/755dab00-cc07-483e-82b6-8a3e54e6dee3-installation-pull-secrets\") pod \"755dab00-cc07-483e-82b6-8a3e54e6dee3\" (UID: \"755dab00-cc07-483e-82b6-8a3e54e6dee3\") " Nov 25 09:12:18 crc kubenswrapper[4565]: I1125 09:12:18.940233 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/755dab00-cc07-483e-82b6-8a3e54e6dee3-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "755dab00-cc07-483e-82b6-8a3e54e6dee3" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:12:18 crc kubenswrapper[4565]: I1125 09:12:18.940776 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/755dab00-cc07-483e-82b6-8a3e54e6dee3-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "755dab00-cc07-483e-82b6-8a3e54e6dee3" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:12:18 crc kubenswrapper[4565]: I1125 09:12:18.943842 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/755dab00-cc07-483e-82b6-8a3e54e6dee3-kube-api-access-8nnsx" (OuterVolumeSpecName: "kube-api-access-8nnsx") pod "755dab00-cc07-483e-82b6-8a3e54e6dee3" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3"). InnerVolumeSpecName "kube-api-access-8nnsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:12:18 crc kubenswrapper[4565]: I1125 09:12:18.944341 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/755dab00-cc07-483e-82b6-8a3e54e6dee3-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "755dab00-cc07-483e-82b6-8a3e54e6dee3" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:12:18 crc kubenswrapper[4565]: I1125 09:12:18.944560 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/755dab00-cc07-483e-82b6-8a3e54e6dee3-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "755dab00-cc07-483e-82b6-8a3e54e6dee3" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3"). 
InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:12:18 crc kubenswrapper[4565]: I1125 09:12:18.944770 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/755dab00-cc07-483e-82b6-8a3e54e6dee3-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "755dab00-cc07-483e-82b6-8a3e54e6dee3" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:12:18 crc kubenswrapper[4565]: I1125 09:12:18.946989 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "755dab00-cc07-483e-82b6-8a3e54e6dee3" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 25 09:12:18 crc kubenswrapper[4565]: I1125 09:12:18.952491 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/755dab00-cc07-483e-82b6-8a3e54e6dee3-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "755dab00-cc07-483e-82b6-8a3e54e6dee3" (UID: "755dab00-cc07-483e-82b6-8a3e54e6dee3"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:12:19 crc kubenswrapper[4565]: I1125 09:12:19.040788 4565 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/755dab00-cc07-483e-82b6-8a3e54e6dee3-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 09:12:19 crc kubenswrapper[4565]: I1125 09:12:19.040894 4565 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/755dab00-cc07-483e-82b6-8a3e54e6dee3-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 25 09:12:19 crc kubenswrapper[4565]: I1125 09:12:19.040968 4565 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/755dab00-cc07-483e-82b6-8a3e54e6dee3-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 25 09:12:19 crc kubenswrapper[4565]: I1125 09:12:19.041020 4565 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/755dab00-cc07-483e-82b6-8a3e54e6dee3-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 25 09:12:19 crc kubenswrapper[4565]: I1125 09:12:19.041074 4565 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/755dab00-cc07-483e-82b6-8a3e54e6dee3-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 25 09:12:19 crc kubenswrapper[4565]: I1125 09:12:19.041123 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nnsx\" (UniqueName: \"kubernetes.io/projected/755dab00-cc07-483e-82b6-8a3e54e6dee3-kube-api-access-8nnsx\") on node \"crc\" DevicePath \"\"" Nov 25 09:12:19 crc kubenswrapper[4565]: I1125 09:12:19.041167 4565 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/755dab00-cc07-483e-82b6-8a3e54e6dee3-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 25 09:12:19 crc 
kubenswrapper[4565]: I1125 09:12:19.177717 4565 generic.go:334] "Generic (PLEG): container finished" podID="755dab00-cc07-483e-82b6-8a3e54e6dee3" containerID="41949da05fd9037742caa54c029e05ed27287cd560a5bb3305ac051698db9243" exitCode=0 Nov 25 09:12:19 crc kubenswrapper[4565]: I1125 09:12:19.177766 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-fljns" Nov 25 09:12:19 crc kubenswrapper[4565]: I1125 09:12:19.177770 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-fljns" event={"ID":"755dab00-cc07-483e-82b6-8a3e54e6dee3","Type":"ContainerDied","Data":"41949da05fd9037742caa54c029e05ed27287cd560a5bb3305ac051698db9243"} Nov 25 09:12:19 crc kubenswrapper[4565]: I1125 09:12:19.178117 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-fljns" event={"ID":"755dab00-cc07-483e-82b6-8a3e54e6dee3","Type":"ContainerDied","Data":"743dd1c94f52feaf1f2e03d5c604327d4d6216f8093a06c862117ea0a7cf1889"} Nov 25 09:12:19 crc kubenswrapper[4565]: I1125 09:12:19.178139 4565 scope.go:117] "RemoveContainer" containerID="41949da05fd9037742caa54c029e05ed27287cd560a5bb3305ac051698db9243" Nov 25 09:12:19 crc kubenswrapper[4565]: I1125 09:12:19.193010 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fljns"] Nov 25 09:12:19 crc kubenswrapper[4565]: I1125 09:12:19.195816 4565 scope.go:117] "RemoveContainer" containerID="41949da05fd9037742caa54c029e05ed27287cd560a5bb3305ac051698db9243" Nov 25 09:12:19 crc kubenswrapper[4565]: E1125 09:12:19.196152 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41949da05fd9037742caa54c029e05ed27287cd560a5bb3305ac051698db9243\": container with ID starting with 41949da05fd9037742caa54c029e05ed27287cd560a5bb3305ac051698db9243 
not found: ID does not exist" containerID="41949da05fd9037742caa54c029e05ed27287cd560a5bb3305ac051698db9243" Nov 25 09:12:19 crc kubenswrapper[4565]: I1125 09:12:19.196238 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41949da05fd9037742caa54c029e05ed27287cd560a5bb3305ac051698db9243"} err="failed to get container status \"41949da05fd9037742caa54c029e05ed27287cd560a5bb3305ac051698db9243\": rpc error: code = NotFound desc = could not find container \"41949da05fd9037742caa54c029e05ed27287cd560a5bb3305ac051698db9243\": container with ID starting with 41949da05fd9037742caa54c029e05ed27287cd560a5bb3305ac051698db9243 not found: ID does not exist" Nov 25 09:12:19 crc kubenswrapper[4565]: I1125 09:12:19.196487 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fljns"] Nov 25 09:12:21 crc kubenswrapper[4565]: I1125 09:12:21.101949 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="755dab00-cc07-483e-82b6-8a3e54e6dee3" path="/var/lib/kubelet/pods/755dab00-cc07-483e-82b6-8a3e54e6dee3/volumes" Nov 25 09:12:55 crc kubenswrapper[4565]: I1125 09:12:55.099178 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:12:55 crc kubenswrapper[4565]: I1125 09:12:55.099564 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:13:07 crc kubenswrapper[4565]: I1125 09:13:07.873830 4565 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-cainjector-7f985d654d-fqzsz"] Nov 25 09:13:07 crc kubenswrapper[4565]: E1125 09:13:07.874407 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="755dab00-cc07-483e-82b6-8a3e54e6dee3" containerName="registry" Nov 25 09:13:07 crc kubenswrapper[4565]: I1125 09:13:07.874420 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="755dab00-cc07-483e-82b6-8a3e54e6dee3" containerName="registry" Nov 25 09:13:07 crc kubenswrapper[4565]: I1125 09:13:07.874495 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="755dab00-cc07-483e-82b6-8a3e54e6dee3" containerName="registry" Nov 25 09:13:07 crc kubenswrapper[4565]: I1125 09:13:07.874768 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-fqzsz" Nov 25 09:13:07 crc kubenswrapper[4565]: I1125 09:13:07.876631 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Nov 25 09:13:07 crc kubenswrapper[4565]: I1125 09:13:07.876833 4565 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-6rgj5" Nov 25 09:13:07 crc kubenswrapper[4565]: I1125 09:13:07.880311 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Nov 25 09:13:07 crc kubenswrapper[4565]: I1125 09:13:07.882241 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-h4n5f"] Nov 25 09:13:07 crc kubenswrapper[4565]: I1125 09:13:07.882650 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-h4n5f" Nov 25 09:13:07 crc kubenswrapper[4565]: I1125 09:13:07.886346 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-fqzsz"] Nov 25 09:13:07 crc kubenswrapper[4565]: I1125 09:13:07.887920 4565 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-thjvr" Nov 25 09:13:07 crc kubenswrapper[4565]: I1125 09:13:07.895370 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-h4n5f"] Nov 25 09:13:07 crc kubenswrapper[4565]: I1125 09:13:07.904787 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-vpzsg"] Nov 25 09:13:07 crc kubenswrapper[4565]: I1125 09:13:07.905597 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-vpzsg" Nov 25 09:13:07 crc kubenswrapper[4565]: I1125 09:13:07.907874 4565 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-jsfd9" Nov 25 09:13:07 crc kubenswrapper[4565]: I1125 09:13:07.913311 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmlrz\" (UniqueName: \"kubernetes.io/projected/bac4df7d-2428-4150-881b-5695b1cfbddd-kube-api-access-fmlrz\") pod \"cert-manager-5b446d88c5-h4n5f\" (UID: \"bac4df7d-2428-4150-881b-5695b1cfbddd\") " pod="cert-manager/cert-manager-5b446d88c5-h4n5f" Nov 25 09:13:07 crc kubenswrapper[4565]: I1125 09:13:07.934491 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-vpzsg"] Nov 25 09:13:08 crc kubenswrapper[4565]: I1125 09:13:08.014973 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkp47\" (UniqueName: 
\"kubernetes.io/projected/be96e081-d820-40aa-81e9-bff6c2392110-kube-api-access-nkp47\") pod \"cert-manager-webhook-5655c58dd6-vpzsg\" (UID: \"be96e081-d820-40aa-81e9-bff6c2392110\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-vpzsg" Nov 25 09:13:08 crc kubenswrapper[4565]: I1125 09:13:08.015039 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkf7n\" (UniqueName: \"kubernetes.io/projected/9500e97b-07b7-43d8-bfdf-dab609ce7f67-kube-api-access-lkf7n\") pod \"cert-manager-cainjector-7f985d654d-fqzsz\" (UID: \"9500e97b-07b7-43d8-bfdf-dab609ce7f67\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-fqzsz" Nov 25 09:13:08 crc kubenswrapper[4565]: I1125 09:13:08.015197 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmlrz\" (UniqueName: \"kubernetes.io/projected/bac4df7d-2428-4150-881b-5695b1cfbddd-kube-api-access-fmlrz\") pod \"cert-manager-5b446d88c5-h4n5f\" (UID: \"bac4df7d-2428-4150-881b-5695b1cfbddd\") " pod="cert-manager/cert-manager-5b446d88c5-h4n5f" Nov 25 09:13:08 crc kubenswrapper[4565]: I1125 09:13:08.032039 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmlrz\" (UniqueName: \"kubernetes.io/projected/bac4df7d-2428-4150-881b-5695b1cfbddd-kube-api-access-fmlrz\") pod \"cert-manager-5b446d88c5-h4n5f\" (UID: \"bac4df7d-2428-4150-881b-5695b1cfbddd\") " pod="cert-manager/cert-manager-5b446d88c5-h4n5f" Nov 25 09:13:08 crc kubenswrapper[4565]: I1125 09:13:08.116003 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkf7n\" (UniqueName: \"kubernetes.io/projected/9500e97b-07b7-43d8-bfdf-dab609ce7f67-kube-api-access-lkf7n\") pod \"cert-manager-cainjector-7f985d654d-fqzsz\" (UID: \"9500e97b-07b7-43d8-bfdf-dab609ce7f67\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-fqzsz" Nov 25 09:13:08 crc kubenswrapper[4565]: I1125 09:13:08.116093 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkp47\" (UniqueName: \"kubernetes.io/projected/be96e081-d820-40aa-81e9-bff6c2392110-kube-api-access-nkp47\") pod \"cert-manager-webhook-5655c58dd6-vpzsg\" (UID: \"be96e081-d820-40aa-81e9-bff6c2392110\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-vpzsg" Nov 25 09:13:08 crc kubenswrapper[4565]: I1125 09:13:08.132255 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkp47\" (UniqueName: \"kubernetes.io/projected/be96e081-d820-40aa-81e9-bff6c2392110-kube-api-access-nkp47\") pod \"cert-manager-webhook-5655c58dd6-vpzsg\" (UID: \"be96e081-d820-40aa-81e9-bff6c2392110\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-vpzsg" Nov 25 09:13:08 crc kubenswrapper[4565]: I1125 09:13:08.132799 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkf7n\" (UniqueName: \"kubernetes.io/projected/9500e97b-07b7-43d8-bfdf-dab609ce7f67-kube-api-access-lkf7n\") pod \"cert-manager-cainjector-7f985d654d-fqzsz\" (UID: \"9500e97b-07b7-43d8-bfdf-dab609ce7f67\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-fqzsz" Nov 25 09:13:08 crc kubenswrapper[4565]: I1125 09:13:08.192989 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-fqzsz" Nov 25 09:13:08 crc kubenswrapper[4565]: I1125 09:13:08.197996 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-h4n5f" Nov 25 09:13:08 crc kubenswrapper[4565]: I1125 09:13:08.216298 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-vpzsg" Nov 25 09:13:08 crc kubenswrapper[4565]: I1125 09:13:08.577094 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-h4n5f"] Nov 25 09:13:08 crc kubenswrapper[4565]: I1125 09:13:08.579772 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-fqzsz"] Nov 25 09:13:08 crc kubenswrapper[4565]: I1125 09:13:08.583137 4565 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 09:13:08 crc kubenswrapper[4565]: I1125 09:13:08.623797 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-vpzsg"] Nov 25 09:13:08 crc kubenswrapper[4565]: W1125 09:13:08.628221 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe96e081_d820_40aa_81e9_bff6c2392110.slice/crio-e48e92a5b3a6f41fad92cef0c3c20dced1046465ea2b780fadb82addf39be671 WatchSource:0}: Error finding container e48e92a5b3a6f41fad92cef0c3c20dced1046465ea2b780fadb82addf39be671: Status 404 returned error can't find the container with id e48e92a5b3a6f41fad92cef0c3c20dced1046465ea2b780fadb82addf39be671 Nov 25 09:13:09 crc kubenswrapper[4565]: I1125 09:13:09.355866 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-fqzsz" event={"ID":"9500e97b-07b7-43d8-bfdf-dab609ce7f67","Type":"ContainerStarted","Data":"8a6c5521cb138e3bbfe345483d6c33e4d79baded885c3bc059e397abb35672c3"} Nov 25 09:13:09 crc kubenswrapper[4565]: I1125 09:13:09.357552 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-vpzsg" event={"ID":"be96e081-d820-40aa-81e9-bff6c2392110","Type":"ContainerStarted","Data":"e48e92a5b3a6f41fad92cef0c3c20dced1046465ea2b780fadb82addf39be671"} Nov 25 09:13:09 crc 
kubenswrapper[4565]: I1125 09:13:09.358521 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-h4n5f" event={"ID":"bac4df7d-2428-4150-881b-5695b1cfbddd","Type":"ContainerStarted","Data":"586ecbd1492a9f41ad7185927c1bee2ac5ac27e1e99baa7a01e9321ba849bb51"} Nov 25 09:13:12 crc kubenswrapper[4565]: I1125 09:13:12.370821 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-fqzsz" event={"ID":"9500e97b-07b7-43d8-bfdf-dab609ce7f67","Type":"ContainerStarted","Data":"fb720d22e9c42b195509d2e0fc95ec4303a3abdf6931a014ebdb4c6d1d51dcd4"} Nov 25 09:13:12 crc kubenswrapper[4565]: I1125 09:13:12.372536 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-vpzsg" event={"ID":"be96e081-d820-40aa-81e9-bff6c2392110","Type":"ContainerStarted","Data":"1d6d5148c70fd2675e79f7aa8f141093468151f1a14f84627658b3b477da0ec5"} Nov 25 09:13:12 crc kubenswrapper[4565]: I1125 09:13:12.372606 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-vpzsg" Nov 25 09:13:12 crc kubenswrapper[4565]: I1125 09:13:12.373404 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-h4n5f" event={"ID":"bac4df7d-2428-4150-881b-5695b1cfbddd","Type":"ContainerStarted","Data":"8494b294f61b6f3dafbde146f5805574838c02071bd138c92b2f7657f7710c75"} Nov 25 09:13:12 crc kubenswrapper[4565]: I1125 09:13:12.383622 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-fqzsz" podStartSLOduration=2.409792185 podStartE2EDuration="5.383611227s" podCreationTimestamp="2025-11-25 09:13:07 +0000 UTC" firstStartedPulling="2025-11-25 09:13:08.584630693 +0000 UTC m=+521.787125831" lastFinishedPulling="2025-11-25 09:13:11.558449734 +0000 UTC m=+524.760944873" observedRunningTime="2025-11-25 09:13:12.380423584 +0000 UTC 
m=+525.582918722" watchObservedRunningTime="2025-11-25 09:13:12.383611227 +0000 UTC m=+525.586106365" Nov 25 09:13:12 crc kubenswrapper[4565]: I1125 09:13:12.393476 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-h4n5f" podStartSLOduration=2.407530986 podStartE2EDuration="5.393463321s" podCreationTimestamp="2025-11-25 09:13:07 +0000 UTC" firstStartedPulling="2025-11-25 09:13:08.582580698 +0000 UTC m=+521.785075836" lastFinishedPulling="2025-11-25 09:13:11.568513033 +0000 UTC m=+524.771008171" observedRunningTime="2025-11-25 09:13:12.391122243 +0000 UTC m=+525.593617381" watchObservedRunningTime="2025-11-25 09:13:12.393463321 +0000 UTC m=+525.595958459" Nov 25 09:13:18 crc kubenswrapper[4565]: I1125 09:13:18.218955 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-vpzsg" Nov 25 09:13:18 crc kubenswrapper[4565]: I1125 09:13:18.231036 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-vpzsg" podStartSLOduration=8.295563384 podStartE2EDuration="11.231025732s" podCreationTimestamp="2025-11-25 09:13:07 +0000 UTC" firstStartedPulling="2025-11-25 09:13:08.630357839 +0000 UTC m=+521.832852977" lastFinishedPulling="2025-11-25 09:13:11.565820187 +0000 UTC m=+524.768315325" observedRunningTime="2025-11-25 09:13:12.412293218 +0000 UTC m=+525.614788356" watchObservedRunningTime="2025-11-25 09:13:18.231025732 +0000 UTC m=+531.433520870" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.164300 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vk74d"] Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.164596 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="ovn-controller" 
containerID="cri-o://9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa" gracePeriod=30 Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.164635 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a" gracePeriod=30 Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.164662 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="kube-rbac-proxy-node" containerID="cri-o://90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe" gracePeriod=30 Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.164680 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="ovn-acl-logging" containerID="cri-o://7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613" gracePeriod=30 Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.164706 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="northd" containerID="cri-o://2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c" gracePeriod=30 Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.164792 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="sbdb" containerID="cri-o://30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5" gracePeriod=30 Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 
09:13:19.167037 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="nbdb" containerID="cri-o://070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a" gracePeriod=30 Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.190442 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="ovnkube-controller" containerID="cri-o://446386ecc8985c225115f5f5270aa4e5dc6f01c72c0e5763b832e06920890368" gracePeriod=30 Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.401300 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vk74d_23e95c48-8d61-4222-a968-b86203ef8aab/ovnkube-controller/3.log" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.403481 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vk74d_23e95c48-8d61-4222-a968-b86203ef8aab/ovn-acl-logging/0.log" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.403951 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vk74d_23e95c48-8d61-4222-a968-b86203ef8aab/ovn-controller/0.log" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.404255 4565 generic.go:334] "Generic (PLEG): container finished" podID="23e95c48-8d61-4222-a968-b86203ef8aab" containerID="446386ecc8985c225115f5f5270aa4e5dc6f01c72c0e5763b832e06920890368" exitCode=0 Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.404290 4565 generic.go:334] "Generic (PLEG): container finished" podID="23e95c48-8d61-4222-a968-b86203ef8aab" containerID="30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5" exitCode=0 Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.404300 4565 generic.go:334] "Generic (PLEG): container 
finished" podID="23e95c48-8d61-4222-a968-b86203ef8aab" containerID="070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a" exitCode=0 Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.404307 4565 generic.go:334] "Generic (PLEG): container finished" podID="23e95c48-8d61-4222-a968-b86203ef8aab" containerID="2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c" exitCode=0 Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.404313 4565 generic.go:334] "Generic (PLEG): container finished" podID="23e95c48-8d61-4222-a968-b86203ef8aab" containerID="3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a" exitCode=0 Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.404321 4565 generic.go:334] "Generic (PLEG): container finished" podID="23e95c48-8d61-4222-a968-b86203ef8aab" containerID="90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe" exitCode=0 Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.404319 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerDied","Data":"446386ecc8985c225115f5f5270aa4e5dc6f01c72c0e5763b832e06920890368"} Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.404352 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerDied","Data":"30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5"} Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.404362 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerDied","Data":"070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a"} Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.404372 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerDied","Data":"2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c"} Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.404381 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerDied","Data":"3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a"} Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.404388 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerDied","Data":"90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe"} Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.404396 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerDied","Data":"7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613"} Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.404326 4565 generic.go:334] "Generic (PLEG): container finished" podID="23e95c48-8d61-4222-a968-b86203ef8aab" containerID="7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613" exitCode=143 Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.404416 4565 generic.go:334] "Generic (PLEG): container finished" podID="23e95c48-8d61-4222-a968-b86203ef8aab" containerID="9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa" exitCode=143 Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.404423 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerDied","Data":"9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa"} Nov 25 09:13:19 
crc kubenswrapper[4565]: I1125 09:13:19.404432 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" event={"ID":"23e95c48-8d61-4222-a968-b86203ef8aab","Type":"ContainerDied","Data":"9a23e7275baf68cf079137414bbfda4c84774b130006cf9b18511cc0fa55d9d9"} Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.404441 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a23e7275baf68cf079137414bbfda4c84774b130006cf9b18511cc0fa55d9d9" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.404416 4565 scope.go:117] "RemoveContainer" containerID="7b95d45578a8935128ba3e8834bef55a7ffce6465b6e61628550da58ad22e576" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.406415 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jpfp5_6d96c20a-2514-47cf-99ec-a314bacac513/kube-multus/2.log" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.406829 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jpfp5_6d96c20a-2514-47cf-99ec-a314bacac513/kube-multus/1.log" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.406870 4565 generic.go:334] "Generic (PLEG): container finished" podID="6d96c20a-2514-47cf-99ec-a314bacac513" containerID="1d74c60a772dcdfd7245f2525a3085bb205b1fe6acde268e9f4df5e531c33ae1" exitCode=2 Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.406886 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jpfp5" event={"ID":"6d96c20a-2514-47cf-99ec-a314bacac513","Type":"ContainerDied","Data":"1d74c60a772dcdfd7245f2525a3085bb205b1fe6acde268e9f4df5e531c33ae1"} Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.408076 4565 scope.go:117] "RemoveContainer" containerID="1d74c60a772dcdfd7245f2525a3085bb205b1fe6acde268e9f4df5e531c33ae1" Nov 25 09:13:19 crc kubenswrapper[4565]: E1125 09:13:19.408272 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-jpfp5_openshift-multus(6d96c20a-2514-47cf-99ec-a314bacac513)\"" pod="openshift-multus/multus-jpfp5" podUID="6d96c20a-2514-47cf-99ec-a314bacac513" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.418959 4565 scope.go:117] "RemoveContainer" containerID="a957dd6a78e51bbac2e5e91939083721f8d1b1efcb75f447880a584b6af8c59e" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.432344 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vk74d_23e95c48-8d61-4222-a968-b86203ef8aab/ovn-acl-logging/0.log" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.432718 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vk74d_23e95c48-8d61-4222-a968-b86203ef8aab/ovn-controller/0.log" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.433134 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.436819 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-var-lib-openvswitch\") pod \"23e95c48-8d61-4222-a968-b86203ef8aab\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.436971 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-cni-netd\") pod \"23e95c48-8d61-4222-a968-b86203ef8aab\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.437080 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdwbt\" (UniqueName: \"kubernetes.io/projected/23e95c48-8d61-4222-a968-b86203ef8aab-kube-api-access-mdwbt\") pod \"23e95c48-8d61-4222-a968-b86203ef8aab\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.437712 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-systemd-units\") pod \"23e95c48-8d61-4222-a968-b86203ef8aab\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.436941 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "23e95c48-8d61-4222-a968-b86203ef8aab" (UID: "23e95c48-8d61-4222-a968-b86203ef8aab"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.437048 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "23e95c48-8d61-4222-a968-b86203ef8aab" (UID: "23e95c48-8d61-4222-a968-b86203ef8aab"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.437766 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "23e95c48-8d61-4222-a968-b86203ef8aab" (UID: "23e95c48-8d61-4222-a968-b86203ef8aab"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.437794 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/23e95c48-8d61-4222-a968-b86203ef8aab-ovnkube-script-lib\") pod \"23e95c48-8d61-4222-a968-b86203ef8aab\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.437869 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-node-log\") pod \"23e95c48-8d61-4222-a968-b86203ef8aab\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.437884 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-etc-openvswitch\") pod \"23e95c48-8d61-4222-a968-b86203ef8aab\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " Nov 
25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.437899 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-log-socket\") pod \"23e95c48-8d61-4222-a968-b86203ef8aab\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.437917 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/23e95c48-8d61-4222-a968-b86203ef8aab-ovn-node-metrics-cert\") pod \"23e95c48-8d61-4222-a968-b86203ef8aab\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.437948 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-run-systemd\") pod \"23e95c48-8d61-4222-a968-b86203ef8aab\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.437963 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-cni-bin\") pod \"23e95c48-8d61-4222-a968-b86203ef8aab\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.437977 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "23e95c48-8d61-4222-a968-b86203ef8aab" (UID: "23e95c48-8d61-4222-a968-b86203ef8aab"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.437984 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-run-ovn-kubernetes\") pod \"23e95c48-8d61-4222-a968-b86203ef8aab\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.438000 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "23e95c48-8d61-4222-a968-b86203ef8aab" (UID: "23e95c48-8d61-4222-a968-b86203ef8aab"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.438020 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-log-socket" (OuterVolumeSpecName: "log-socket") pod "23e95c48-8d61-4222-a968-b86203ef8aab" (UID: "23e95c48-8d61-4222-a968-b86203ef8aab"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.438033 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-slash\") pod \"23e95c48-8d61-4222-a968-b86203ef8aab\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.438071 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/23e95c48-8d61-4222-a968-b86203ef8aab-ovnkube-config\") pod \"23e95c48-8d61-4222-a968-b86203ef8aab\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.438091 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-kubelet\") pod \"23e95c48-8d61-4222-a968-b86203ef8aab\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.438106 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-run-ovn\") pod \"23e95c48-8d61-4222-a968-b86203ef8aab\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.438132 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/23e95c48-8d61-4222-a968-b86203ef8aab-env-overrides\") pod \"23e95c48-8d61-4222-a968-b86203ef8aab\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.438443 4565 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.438459 4565 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-cni-netd\") on node \"crc\" DevicePath \"\"" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.438468 4565 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-systemd-units\") on node \"crc\" DevicePath \"\"" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.438476 4565 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.438483 4565 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-log-socket\") on node \"crc\" DevicePath \"\"" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.438498 4565 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.438697 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23e95c48-8d61-4222-a968-b86203ef8aab-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "23e95c48-8d61-4222-a968-b86203ef8aab" (UID: "23e95c48-8d61-4222-a968-b86203ef8aab"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.438777 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "23e95c48-8d61-4222-a968-b86203ef8aab" (UID: "23e95c48-8d61-4222-a968-b86203ef8aab"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.438879 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-node-log" (OuterVolumeSpecName: "node-log") pod "23e95c48-8d61-4222-a968-b86203ef8aab" (UID: "23e95c48-8d61-4222-a968-b86203ef8aab"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.438913 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "23e95c48-8d61-4222-a968-b86203ef8aab" (UID: "23e95c48-8d61-4222-a968-b86203ef8aab"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.439021 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-slash" (OuterVolumeSpecName: "host-slash") pod "23e95c48-8d61-4222-a968-b86203ef8aab" (UID: "23e95c48-8d61-4222-a968-b86203ef8aab"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.439186 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23e95c48-8d61-4222-a968-b86203ef8aab-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "23e95c48-8d61-4222-a968-b86203ef8aab" (UID: "23e95c48-8d61-4222-a968-b86203ef8aab"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.439264 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "23e95c48-8d61-4222-a968-b86203ef8aab" (UID: "23e95c48-8d61-4222-a968-b86203ef8aab"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.439365 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23e95c48-8d61-4222-a968-b86203ef8aab-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "23e95c48-8d61-4222-a968-b86203ef8aab" (UID: "23e95c48-8d61-4222-a968-b86203ef8aab"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.441793 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e95c48-8d61-4222-a968-b86203ef8aab-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "23e95c48-8d61-4222-a968-b86203ef8aab" (UID: "23e95c48-8d61-4222-a968-b86203ef8aab"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.442152 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23e95c48-8d61-4222-a968-b86203ef8aab-kube-api-access-mdwbt" (OuterVolumeSpecName: "kube-api-access-mdwbt") pod "23e95c48-8d61-4222-a968-b86203ef8aab" (UID: "23e95c48-8d61-4222-a968-b86203ef8aab"). InnerVolumeSpecName "kube-api-access-mdwbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.447611 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "23e95c48-8d61-4222-a968-b86203ef8aab" (UID: "23e95c48-8d61-4222-a968-b86203ef8aab"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.467285 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sfl6v"] Nov 25 09:13:19 crc kubenswrapper[4565]: E1125 09:13:19.467561 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="northd" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.467638 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="northd" Nov 25 09:13:19 crc kubenswrapper[4565]: E1125 09:13:19.467690 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="ovnkube-controller" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.467732 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="ovnkube-controller" Nov 25 09:13:19 crc kubenswrapper[4565]: E1125 09:13:19.467773 4565 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="ovnkube-controller" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.467811 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="ovnkube-controller" Nov 25 09:13:19 crc kubenswrapper[4565]: E1125 09:13:19.467851 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="ovnkube-controller" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.467894 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="ovnkube-controller" Nov 25 09:13:19 crc kubenswrapper[4565]: E1125 09:13:19.467965 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="ovnkube-controller" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.468011 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="ovnkube-controller" Nov 25 09:13:19 crc kubenswrapper[4565]: E1125 09:13:19.468053 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="nbdb" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.468091 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="nbdb" Nov 25 09:13:19 crc kubenswrapper[4565]: E1125 09:13:19.468152 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="kube-rbac-proxy-node" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.468204 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="kube-rbac-proxy-node" Nov 25 09:13:19 crc kubenswrapper[4565]: E1125 09:13:19.468247 4565 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="sbdb" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.468288 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="sbdb" Nov 25 09:13:19 crc kubenswrapper[4565]: E1125 09:13:19.468329 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="kubecfg-setup" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.468369 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="kubecfg-setup" Nov 25 09:13:19 crc kubenswrapper[4565]: E1125 09:13:19.468411 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="ovn-acl-logging" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.468454 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="ovn-acl-logging" Nov 25 09:13:19 crc kubenswrapper[4565]: E1125 09:13:19.468496 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="ovnkube-controller" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.468535 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="ovnkube-controller" Nov 25 09:13:19 crc kubenswrapper[4565]: E1125 09:13:19.468578 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="ovn-controller" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.468617 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="ovn-controller" Nov 25 09:13:19 crc kubenswrapper[4565]: E1125 09:13:19.468659 4565 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="kube-rbac-proxy-ovn-metrics" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.468697 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="kube-rbac-proxy-ovn-metrics" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.468842 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="ovn-acl-logging" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.468958 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="ovn-controller" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.469007 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="ovnkube-controller" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.469084 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="sbdb" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.469147 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="ovnkube-controller" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.469191 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="kube-rbac-proxy-ovn-metrics" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.469243 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="northd" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.469304 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="nbdb" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.469347 4565 
memory_manager.go:354] "RemoveStaleState removing state" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="ovnkube-controller" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.469388 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="kube-rbac-proxy-node" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.469576 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="ovnkube-controller" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.469625 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" containerName="ovnkube-controller" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.471263 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539223 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-run-openvswitch\") pod \"23e95c48-8d61-4222-a968-b86203ef8aab\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539261 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-run-netns\") pod \"23e95c48-8d61-4222-a968-b86203ef8aab\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539281 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"23e95c48-8d61-4222-a968-b86203ef8aab\" (UID: \"23e95c48-8d61-4222-a968-b86203ef8aab\") " Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539326 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "23e95c48-8d61-4222-a968-b86203ef8aab" (UID: "23e95c48-8d61-4222-a968-b86203ef8aab"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539335 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "23e95c48-8d61-4222-a968-b86203ef8aab" (UID: "23e95c48-8d61-4222-a968-b86203ef8aab"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539384 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "23e95c48-8d61-4222-a968-b86203ef8aab" (UID: "23e95c48-8d61-4222-a968-b86203ef8aab"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539454 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/85f2e5bc-ce70-4995-a428-f363b06b5873-env-overrides\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539472 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-node-log\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539496 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-host-kubelet\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539512 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/85f2e5bc-ce70-4995-a428-f363b06b5873-ovnkube-config\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539529 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-host-cni-netd\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539612 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-var-lib-openvswitch\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539634 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-log-socket\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539662 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-host-slash\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539681 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-run-openvswitch\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539700 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-systemd-units\") pod \"ovnkube-node-sfl6v\" (UID: 
\"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539733 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539755 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-etc-openvswitch\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539783 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/85f2e5bc-ce70-4995-a428-f363b06b5873-ovn-node-metrics-cert\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539798 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/85f2e5bc-ce70-4995-a428-f363b06b5873-ovnkube-script-lib\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539824 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-run-ovn\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539844 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-run-systemd\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539863 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-host-run-ovn-kubernetes\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539891 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmndv\" (UniqueName: \"kubernetes.io/projected/85f2e5bc-ce70-4995-a428-f363b06b5873-kube-api-access-vmndv\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.539959 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-host-cni-bin\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.540001 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-host-run-netns\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.540052 4565 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-node-log\") on node \"crc\" DevicePath \"\"" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.540067 4565 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/23e95c48-8d61-4222-a968-b86203ef8aab-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.540076 4565 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-run-systemd\") on node \"crc\" DevicePath \"\"" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.540084 4565 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-cni-bin\") on node \"crc\" DevicePath \"\"" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.540092 4565 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-slash\") on node \"crc\" DevicePath \"\"" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.540100 4565 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/23e95c48-8d61-4222-a968-b86203ef8aab-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.540119 4565 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-kubelet\") on node \"crc\" DevicePath \"\"" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.540128 4565 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.540136 4565 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/23e95c48-8d61-4222-a968-b86203ef8aab-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.540147 4565 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-run-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.540157 4565 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-run-netns\") on node \"crc\" DevicePath \"\"" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.540165 4565 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23e95c48-8d61-4222-a968-b86203ef8aab-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.540174 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdwbt\" (UniqueName: \"kubernetes.io/projected/23e95c48-8d61-4222-a968-b86203ef8aab-kube-api-access-mdwbt\") on node \"crc\" DevicePath \"\"" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.540181 4565 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/23e95c48-8d61-4222-a968-b86203ef8aab-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.640539 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-host-slash\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.640572 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-systemd-units\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.640589 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-run-openvswitch\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.640607 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.640621 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-etc-openvswitch\") pod \"ovnkube-node-sfl6v\" (UID: 
\"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.640643 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/85f2e5bc-ce70-4995-a428-f363b06b5873-ovn-node-metrics-cert\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.640656 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/85f2e5bc-ce70-4995-a428-f363b06b5873-ovnkube-script-lib\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.640673 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-run-systemd\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.640687 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-run-ovn\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.640701 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-host-run-ovn-kubernetes\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.640721 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmndv\" (UniqueName: \"kubernetes.io/projected/85f2e5bc-ce70-4995-a428-f363b06b5873-kube-api-access-vmndv\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.640736 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-host-cni-bin\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.640750 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-host-run-netns\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.640769 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/85f2e5bc-ce70-4995-a428-f363b06b5873-env-overrides\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.640784 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-node-log\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc 
kubenswrapper[4565]: I1125 09:13:19.640805 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-host-kubelet\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.640818 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/85f2e5bc-ce70-4995-a428-f363b06b5873-ovnkube-config\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.640830 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-host-cni-netd\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.640847 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-var-lib-openvswitch\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.640858 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-log-socket\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.640905 4565 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-log-socket\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.640948 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-host-slash\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.640969 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-systemd-units\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.640986 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-run-openvswitch\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.641003 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.641025 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-host-run-netns\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.641088 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-host-cni-bin\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.641183 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-node-log\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.641243 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-host-kubelet\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.641273 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-host-cni-netd\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.641299 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-etc-openvswitch\") pod \"ovnkube-node-sfl6v\" (UID: 
\"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.641440 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-host-run-ovn-kubernetes\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.641480 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-run-ovn\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.641487 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/85f2e5bc-ce70-4995-a428-f363b06b5873-env-overrides\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.641508 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-var-lib-openvswitch\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.641786 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/85f2e5bc-ce70-4995-a428-f363b06b5873-ovnkube-config\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 
09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.641905 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/85f2e5bc-ce70-4995-a428-f363b06b5873-ovnkube-script-lib\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.641976 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/85f2e5bc-ce70-4995-a428-f363b06b5873-run-systemd\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.643544 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/85f2e5bc-ce70-4995-a428-f363b06b5873-ovn-node-metrics-cert\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.653975 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmndv\" (UniqueName: \"kubernetes.io/projected/85f2e5bc-ce70-4995-a428-f363b06b5873-kube-api-access-vmndv\") pod \"ovnkube-node-sfl6v\" (UID: \"85f2e5bc-ce70-4995-a428-f363b06b5873\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:19 crc kubenswrapper[4565]: I1125 09:13:19.783348 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:20 crc kubenswrapper[4565]: I1125 09:13:20.412532 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jpfp5_6d96c20a-2514-47cf-99ec-a314bacac513/kube-multus/2.log" Nov 25 09:13:20 crc kubenswrapper[4565]: I1125 09:13:20.415887 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vk74d_23e95c48-8d61-4222-a968-b86203ef8aab/ovn-acl-logging/0.log" Nov 25 09:13:20 crc kubenswrapper[4565]: I1125 09:13:20.416290 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vk74d_23e95c48-8d61-4222-a968-b86203ef8aab/ovn-controller/0.log" Nov 25 09:13:20 crc kubenswrapper[4565]: I1125 09:13:20.416607 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vk74d" Nov 25 09:13:20 crc kubenswrapper[4565]: I1125 09:13:20.417576 4565 generic.go:334] "Generic (PLEG): container finished" podID="85f2e5bc-ce70-4995-a428-f363b06b5873" containerID="7a9409e1c0b0304360af24a473087e0f6417d2bf3709d130a4ee42d4719b499f" exitCode=0 Nov 25 09:13:20 crc kubenswrapper[4565]: I1125 09:13:20.417605 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" event={"ID":"85f2e5bc-ce70-4995-a428-f363b06b5873","Type":"ContainerDied","Data":"7a9409e1c0b0304360af24a473087e0f6417d2bf3709d130a4ee42d4719b499f"} Nov 25 09:13:20 crc kubenswrapper[4565]: I1125 09:13:20.417623 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" event={"ID":"85f2e5bc-ce70-4995-a428-f363b06b5873","Type":"ContainerStarted","Data":"17ad1328ed1a672938e0bd30487abeee22f7adbc1d725f1822f5f40c0d8bf15e"} Nov 25 09:13:20 crc kubenswrapper[4565]: I1125 09:13:20.455371 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-vk74d"] Nov 25 09:13:20 crc kubenswrapper[4565]: I1125 09:13:20.461122 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vk74d"] Nov 25 09:13:21 crc kubenswrapper[4565]: I1125 09:13:21.109108 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23e95c48-8d61-4222-a968-b86203ef8aab" path="/var/lib/kubelet/pods/23e95c48-8d61-4222-a968-b86203ef8aab/volumes" Nov 25 09:13:21 crc kubenswrapper[4565]: I1125 09:13:21.424525 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" event={"ID":"85f2e5bc-ce70-4995-a428-f363b06b5873","Type":"ContainerStarted","Data":"22280a5aa639e505b8fecf43a1c08915a6107b7d00256248d1163b17865e18c5"} Nov 25 09:13:21 crc kubenswrapper[4565]: I1125 09:13:21.424565 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" event={"ID":"85f2e5bc-ce70-4995-a428-f363b06b5873","Type":"ContainerStarted","Data":"80d19cc8d860c901b52ac61b8a98eca987fb298f3b64bd885af0f19825f389f6"} Nov 25 09:13:21 crc kubenswrapper[4565]: I1125 09:13:21.424574 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" event={"ID":"85f2e5bc-ce70-4995-a428-f363b06b5873","Type":"ContainerStarted","Data":"b0b653b6759393ea7d4e7a7a687b0fcda9924197915c144abe3689550f3ab6a5"} Nov 25 09:13:21 crc kubenswrapper[4565]: I1125 09:13:21.424599 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" event={"ID":"85f2e5bc-ce70-4995-a428-f363b06b5873","Type":"ContainerStarted","Data":"b989c6e8ba97b1ed5b44ffe920ce930b18f0245d068055ef3995385db9c4d3b4"} Nov 25 09:13:21 crc kubenswrapper[4565]: I1125 09:13:21.424607 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" 
event={"ID":"85f2e5bc-ce70-4995-a428-f363b06b5873","Type":"ContainerStarted","Data":"f7a48226d2b265d392d482692b353098433b727394236d821fe33438e72c60c5"} Nov 25 09:13:21 crc kubenswrapper[4565]: I1125 09:13:21.424614 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" event={"ID":"85f2e5bc-ce70-4995-a428-f363b06b5873","Type":"ContainerStarted","Data":"fc36a14b6e69b41caccff20dc635502bdec1fc06fee30eb66909ec1a09d889c7"} Nov 25 09:13:23 crc kubenswrapper[4565]: I1125 09:13:23.434883 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" event={"ID":"85f2e5bc-ce70-4995-a428-f363b06b5873","Type":"ContainerStarted","Data":"288ed45ffb49a83d42e14c1c3233a4d5e8d1b3a1c77528850d147b0844ec1155"} Nov 25 09:13:25 crc kubenswrapper[4565]: I1125 09:13:25.099080 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:13:25 crc kubenswrapper[4565]: I1125 09:13:25.099242 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:13:25 crc kubenswrapper[4565]: I1125 09:13:25.445094 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" event={"ID":"85f2e5bc-ce70-4995-a428-f363b06b5873","Type":"ContainerStarted","Data":"5eeee19a2c0111c19d33220c7d2da4fa31a3b92b8611a5d10a517ec3c7232849"} Nov 25 09:13:25 crc kubenswrapper[4565]: I1125 09:13:25.445373 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:25 crc kubenswrapper[4565]: I1125 09:13:25.445403 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:25 crc kubenswrapper[4565]: I1125 09:13:25.467399 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:25 crc kubenswrapper[4565]: I1125 09:13:25.479437 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" podStartSLOduration=6.479427721 podStartE2EDuration="6.479427721s" podCreationTimestamp="2025-11-25 09:13:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:13:25.478011798 +0000 UTC m=+538.680506936" watchObservedRunningTime="2025-11-25 09:13:25.479427721 +0000 UTC m=+538.681922859" Nov 25 09:13:26 crc kubenswrapper[4565]: I1125 09:13:26.448946 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:26 crc kubenswrapper[4565]: I1125 09:13:26.469245 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:27 crc kubenswrapper[4565]: I1125 09:13:27.188447 4565 scope.go:117] "RemoveContainer" containerID="30d4d40fcb45429e10204a8bf666e8cf34048c41d734c494ea549c13597501b5" Nov 25 09:13:27 crc kubenswrapper[4565]: I1125 09:13:27.197618 4565 scope.go:117] "RemoveContainer" containerID="90fdc80a6912cba20cb017904207c5b2d223db3dad021350f5114fb76ffbdffe" Nov 25 09:13:27 crc kubenswrapper[4565]: I1125 09:13:27.208763 4565 scope.go:117] "RemoveContainer" containerID="3cd0e2aa5e1ec7306ce3c93579be9a94d0d84e7173302e6f993c7459bf63cb9a" Nov 25 09:13:27 crc kubenswrapper[4565]: I1125 09:13:27.217596 4565 scope.go:117] "RemoveContainer" 
containerID="babab72e99050a0e943b339c8337ea7fb1aaafb9cdad1778a849d69da731d678" Nov 25 09:13:27 crc kubenswrapper[4565]: I1125 09:13:27.229434 4565 scope.go:117] "RemoveContainer" containerID="070291e47201f26533bd04c85d9b276ec4f81631b5f3b5529e0e8f2e1e9a7a5a" Nov 25 09:13:27 crc kubenswrapper[4565]: I1125 09:13:27.238633 4565 scope.go:117] "RemoveContainer" containerID="9baf15dea94b6744967c5d1353f143be701435678f639f06fa4ff764ca916daa" Nov 25 09:13:27 crc kubenswrapper[4565]: I1125 09:13:27.253512 4565 scope.go:117] "RemoveContainer" containerID="446386ecc8985c225115f5f5270aa4e5dc6f01c72c0e5763b832e06920890368" Nov 25 09:13:27 crc kubenswrapper[4565]: I1125 09:13:27.269268 4565 scope.go:117] "RemoveContainer" containerID="2631998686d20ad098a64ef2a370edb0572ec6012adab7ee022e7dd410f4f11c" Nov 25 09:13:27 crc kubenswrapper[4565]: I1125 09:13:27.277019 4565 scope.go:117] "RemoveContainer" containerID="7734e750eb3ca264b9d3650e39d468e5fc7dd4dd4a507367db6ed0bb00d7b613" Nov 25 09:13:33 crc kubenswrapper[4565]: I1125 09:13:33.098374 4565 scope.go:117] "RemoveContainer" containerID="1d74c60a772dcdfd7245f2525a3085bb205b1fe6acde268e9f4df5e531c33ae1" Nov 25 09:13:33 crc kubenswrapper[4565]: E1125 09:13:33.099610 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-jpfp5_openshift-multus(6d96c20a-2514-47cf-99ec-a314bacac513)\"" pod="openshift-multus/multus-jpfp5" podUID="6d96c20a-2514-47cf-99ec-a314bacac513" Nov 25 09:13:44 crc kubenswrapper[4565]: I1125 09:13:44.096639 4565 scope.go:117] "RemoveContainer" containerID="1d74c60a772dcdfd7245f2525a3085bb205b1fe6acde268e9f4df5e531c33ae1" Nov 25 09:13:44 crc kubenswrapper[4565]: I1125 09:13:44.515064 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jpfp5_6d96c20a-2514-47cf-99ec-a314bacac513/kube-multus/2.log" Nov 25 09:13:44 crc kubenswrapper[4565]: I1125 
09:13:44.515243 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jpfp5" event={"ID":"6d96c20a-2514-47cf-99ec-a314bacac513","Type":"ContainerStarted","Data":"e7a3619067cc406f63ef4b95a90a6224edb9688d10a31b491c6a5f4f424c815c"} Nov 25 09:13:48 crc kubenswrapper[4565]: I1125 09:13:48.368507 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9"] Nov 25 09:13:48 crc kubenswrapper[4565]: I1125 09:13:48.369496 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9" Nov 25 09:13:48 crc kubenswrapper[4565]: I1125 09:13:48.371537 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 25 09:13:48 crc kubenswrapper[4565]: I1125 09:13:48.381165 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9"] Nov 25 09:13:48 crc kubenswrapper[4565]: I1125 09:13:48.416224 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/950c1190-c404-483f-bb4a-5a3fe7548ccf-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9\" (UID: \"950c1190-c404-483f-bb4a-5a3fe7548ccf\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9" Nov 25 09:13:48 crc kubenswrapper[4565]: I1125 09:13:48.416283 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/950c1190-c404-483f-bb4a-5a3fe7548ccf-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9\" (UID: \"950c1190-c404-483f-bb4a-5a3fe7548ccf\") " 
pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9" Nov 25 09:13:48 crc kubenswrapper[4565]: I1125 09:13:48.416330 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz45k\" (UniqueName: \"kubernetes.io/projected/950c1190-c404-483f-bb4a-5a3fe7548ccf-kube-api-access-xz45k\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9\" (UID: \"950c1190-c404-483f-bb4a-5a3fe7548ccf\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9" Nov 25 09:13:48 crc kubenswrapper[4565]: I1125 09:13:48.516908 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/950c1190-c404-483f-bb4a-5a3fe7548ccf-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9\" (UID: \"950c1190-c404-483f-bb4a-5a3fe7548ccf\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9" Nov 25 09:13:48 crc kubenswrapper[4565]: I1125 09:13:48.516989 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/950c1190-c404-483f-bb4a-5a3fe7548ccf-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9\" (UID: \"950c1190-c404-483f-bb4a-5a3fe7548ccf\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9" Nov 25 09:13:48 crc kubenswrapper[4565]: I1125 09:13:48.517046 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz45k\" (UniqueName: \"kubernetes.io/projected/950c1190-c404-483f-bb4a-5a3fe7548ccf-kube-api-access-xz45k\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9\" (UID: \"950c1190-c404-483f-bb4a-5a3fe7548ccf\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9" Nov 25 
09:13:48 crc kubenswrapper[4565]: I1125 09:13:48.517360 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/950c1190-c404-483f-bb4a-5a3fe7548ccf-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9\" (UID: \"950c1190-c404-483f-bb4a-5a3fe7548ccf\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9" Nov 25 09:13:48 crc kubenswrapper[4565]: I1125 09:13:48.517394 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/950c1190-c404-483f-bb4a-5a3fe7548ccf-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9\" (UID: \"950c1190-c404-483f-bb4a-5a3fe7548ccf\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9" Nov 25 09:13:48 crc kubenswrapper[4565]: I1125 09:13:48.535479 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz45k\" (UniqueName: \"kubernetes.io/projected/950c1190-c404-483f-bb4a-5a3fe7548ccf-kube-api-access-xz45k\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9\" (UID: \"950c1190-c404-483f-bb4a-5a3fe7548ccf\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9" Nov 25 09:13:48 crc kubenswrapper[4565]: I1125 09:13:48.686655 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9" Nov 25 09:13:49 crc kubenswrapper[4565]: I1125 09:13:49.043093 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9"] Nov 25 09:13:49 crc kubenswrapper[4565]: W1125 09:13:49.048348 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod950c1190_c404_483f_bb4a_5a3fe7548ccf.slice/crio-812e59476e32b1bfd0f09cecf368f6352a149007727d3b3d4675413868286b5b WatchSource:0}: Error finding container 812e59476e32b1bfd0f09cecf368f6352a149007727d3b3d4675413868286b5b: Status 404 returned error can't find the container with id 812e59476e32b1bfd0f09cecf368f6352a149007727d3b3d4675413868286b5b Nov 25 09:13:49 crc kubenswrapper[4565]: I1125 09:13:49.540976 4565 generic.go:334] "Generic (PLEG): container finished" podID="950c1190-c404-483f-bb4a-5a3fe7548ccf" containerID="e0bc5465929cac3a34600fafc7073cf990f1b6544dd2c656725e1d79bd48fc9b" exitCode=0 Nov 25 09:13:49 crc kubenswrapper[4565]: I1125 09:13:49.541089 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9" event={"ID":"950c1190-c404-483f-bb4a-5a3fe7548ccf","Type":"ContainerDied","Data":"e0bc5465929cac3a34600fafc7073cf990f1b6544dd2c656725e1d79bd48fc9b"} Nov 25 09:13:49 crc kubenswrapper[4565]: I1125 09:13:49.541148 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9" event={"ID":"950c1190-c404-483f-bb4a-5a3fe7548ccf","Type":"ContainerStarted","Data":"812e59476e32b1bfd0f09cecf368f6352a149007727d3b3d4675413868286b5b"} Nov 25 09:13:49 crc kubenswrapper[4565]: I1125 09:13:49.801165 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-sfl6v" Nov 25 09:13:51 crc kubenswrapper[4565]: I1125 09:13:51.553780 4565 generic.go:334] "Generic (PLEG): container finished" podID="950c1190-c404-483f-bb4a-5a3fe7548ccf" containerID="d32672db90010d67df690ab21ce9806a25edfd5cda2b74cdacaed0ece091780e" exitCode=0 Nov 25 09:13:51 crc kubenswrapper[4565]: I1125 09:13:51.553849 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9" event={"ID":"950c1190-c404-483f-bb4a-5a3fe7548ccf","Type":"ContainerDied","Data":"d32672db90010d67df690ab21ce9806a25edfd5cda2b74cdacaed0ece091780e"} Nov 25 09:13:52 crc kubenswrapper[4565]: I1125 09:13:52.563324 4565 generic.go:334] "Generic (PLEG): container finished" podID="950c1190-c404-483f-bb4a-5a3fe7548ccf" containerID="1a744dad129c9dc1c62e5571ee8cf5d0093364f993111ee3ca81cfd33f389d75" exitCode=0 Nov 25 09:13:52 crc kubenswrapper[4565]: I1125 09:13:52.563371 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9" event={"ID":"950c1190-c404-483f-bb4a-5a3fe7548ccf","Type":"ContainerDied","Data":"1a744dad129c9dc1c62e5571ee8cf5d0093364f993111ee3ca81cfd33f389d75"} Nov 25 09:13:53 crc kubenswrapper[4565]: I1125 09:13:53.754300 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9" Nov 25 09:13:53 crc kubenswrapper[4565]: I1125 09:13:53.880952 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/950c1190-c404-483f-bb4a-5a3fe7548ccf-bundle\") pod \"950c1190-c404-483f-bb4a-5a3fe7548ccf\" (UID: \"950c1190-c404-483f-bb4a-5a3fe7548ccf\") " Nov 25 09:13:53 crc kubenswrapper[4565]: I1125 09:13:53.881048 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz45k\" (UniqueName: \"kubernetes.io/projected/950c1190-c404-483f-bb4a-5a3fe7548ccf-kube-api-access-xz45k\") pod \"950c1190-c404-483f-bb4a-5a3fe7548ccf\" (UID: \"950c1190-c404-483f-bb4a-5a3fe7548ccf\") " Nov 25 09:13:53 crc kubenswrapper[4565]: I1125 09:13:53.881138 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/950c1190-c404-483f-bb4a-5a3fe7548ccf-util\") pod \"950c1190-c404-483f-bb4a-5a3fe7548ccf\" (UID: \"950c1190-c404-483f-bb4a-5a3fe7548ccf\") " Nov 25 09:13:53 crc kubenswrapper[4565]: I1125 09:13:53.881503 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/950c1190-c404-483f-bb4a-5a3fe7548ccf-bundle" (OuterVolumeSpecName: "bundle") pod "950c1190-c404-483f-bb4a-5a3fe7548ccf" (UID: "950c1190-c404-483f-bb4a-5a3fe7548ccf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:13:53 crc kubenswrapper[4565]: I1125 09:13:53.886235 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/950c1190-c404-483f-bb4a-5a3fe7548ccf-kube-api-access-xz45k" (OuterVolumeSpecName: "kube-api-access-xz45k") pod "950c1190-c404-483f-bb4a-5a3fe7548ccf" (UID: "950c1190-c404-483f-bb4a-5a3fe7548ccf"). InnerVolumeSpecName "kube-api-access-xz45k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:13:53 crc kubenswrapper[4565]: I1125 09:13:53.893828 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/950c1190-c404-483f-bb4a-5a3fe7548ccf-util" (OuterVolumeSpecName: "util") pod "950c1190-c404-483f-bb4a-5a3fe7548ccf" (UID: "950c1190-c404-483f-bb4a-5a3fe7548ccf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:13:53 crc kubenswrapper[4565]: I1125 09:13:53.982496 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz45k\" (UniqueName: \"kubernetes.io/projected/950c1190-c404-483f-bb4a-5a3fe7548ccf-kube-api-access-xz45k\") on node \"crc\" DevicePath \"\"" Nov 25 09:13:53 crc kubenswrapper[4565]: I1125 09:13:53.982523 4565 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/950c1190-c404-483f-bb4a-5a3fe7548ccf-util\") on node \"crc\" DevicePath \"\"" Nov 25 09:13:53 crc kubenswrapper[4565]: I1125 09:13:53.982536 4565 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/950c1190-c404-483f-bb4a-5a3fe7548ccf-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:13:54 crc kubenswrapper[4565]: I1125 09:13:54.573736 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9" event={"ID":"950c1190-c404-483f-bb4a-5a3fe7548ccf","Type":"ContainerDied","Data":"812e59476e32b1bfd0f09cecf368f6352a149007727d3b3d4675413868286b5b"} Nov 25 09:13:54 crc kubenswrapper[4565]: I1125 09:13:54.573986 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="812e59476e32b1bfd0f09cecf368f6352a149007727d3b3d4675413868286b5b" Nov 25 09:13:54 crc kubenswrapper[4565]: I1125 09:13:54.573791 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9" Nov 25 09:13:55 crc kubenswrapper[4565]: I1125 09:13:55.099063 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:13:55 crc kubenswrapper[4565]: I1125 09:13:55.099105 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:13:55 crc kubenswrapper[4565]: I1125 09:13:55.101421 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 09:13:55 crc kubenswrapper[4565]: I1125 09:13:55.101765 4565 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c83da26a41463f944ec153ed3943109e4f0b3cdfd67ffe37055c08c437f4c00f"} pod="openshift-machine-config-operator/machine-config-daemon-r28bt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 09:13:55 crc kubenswrapper[4565]: I1125 09:13:55.101802 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" containerID="cri-o://c83da26a41463f944ec153ed3943109e4f0b3cdfd67ffe37055c08c437f4c00f" gracePeriod=600 Nov 25 09:13:55 crc kubenswrapper[4565]: I1125 09:13:55.580262 4565 generic.go:334] "Generic (PLEG): 
container finished" podID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerID="c83da26a41463f944ec153ed3943109e4f0b3cdfd67ffe37055c08c437f4c00f" exitCode=0 Nov 25 09:13:55 crc kubenswrapper[4565]: I1125 09:13:55.580352 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerDied","Data":"c83da26a41463f944ec153ed3943109e4f0b3cdfd67ffe37055c08c437f4c00f"} Nov 25 09:13:55 crc kubenswrapper[4565]: I1125 09:13:55.580662 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerStarted","Data":"c10fd5b53bc647595e50d0c679601ea47018805ef1aa79f6ca728fb0a4552a71"} Nov 25 09:13:55 crc kubenswrapper[4565]: I1125 09:13:55.580691 4565 scope.go:117] "RemoveContainer" containerID="cf382e7b4350d947c6ba89dc567329878d4281ac0e389a463a63fd9a1cf7db93" Nov 25 09:13:56 crc kubenswrapper[4565]: I1125 09:13:56.176295 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-xvfww"] Nov 25 09:13:56 crc kubenswrapper[4565]: E1125 09:13:56.177431 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="950c1190-c404-483f-bb4a-5a3fe7548ccf" containerName="util" Nov 25 09:13:56 crc kubenswrapper[4565]: I1125 09:13:56.177499 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="950c1190-c404-483f-bb4a-5a3fe7548ccf" containerName="util" Nov 25 09:13:56 crc kubenswrapper[4565]: E1125 09:13:56.177559 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="950c1190-c404-483f-bb4a-5a3fe7548ccf" containerName="extract" Nov 25 09:13:56 crc kubenswrapper[4565]: I1125 09:13:56.177609 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="950c1190-c404-483f-bb4a-5a3fe7548ccf" containerName="extract" Nov 25 09:13:56 crc kubenswrapper[4565]: E1125 09:13:56.177660 
4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="950c1190-c404-483f-bb4a-5a3fe7548ccf" containerName="pull" Nov 25 09:13:56 crc kubenswrapper[4565]: I1125 09:13:56.177711 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="950c1190-c404-483f-bb4a-5a3fe7548ccf" containerName="pull" Nov 25 09:13:56 crc kubenswrapper[4565]: I1125 09:13:56.177839 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="950c1190-c404-483f-bb4a-5a3fe7548ccf" containerName="extract" Nov 25 09:13:56 crc kubenswrapper[4565]: I1125 09:13:56.178228 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-xvfww" Nov 25 09:13:56 crc kubenswrapper[4565]: I1125 09:13:56.180141 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Nov 25 09:13:56 crc kubenswrapper[4565]: I1125 09:13:56.180551 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Nov 25 09:13:56 crc kubenswrapper[4565]: I1125 09:13:56.180623 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-bbdc8" Nov 25 09:13:56 crc kubenswrapper[4565]: I1125 09:13:56.191168 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-xvfww"] Nov 25 09:13:56 crc kubenswrapper[4565]: I1125 09:13:56.208098 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb2c5\" (UniqueName: \"kubernetes.io/projected/63b01418-682e-4ebe-874d-aab5928c222a-kube-api-access-mb2c5\") pod \"nmstate-operator-557fdffb88-xvfww\" (UID: \"63b01418-682e-4ebe-874d-aab5928c222a\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-xvfww" Nov 25 09:13:56 crc kubenswrapper[4565]: I1125 09:13:56.309299 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-mb2c5\" (UniqueName: \"kubernetes.io/projected/63b01418-682e-4ebe-874d-aab5928c222a-kube-api-access-mb2c5\") pod \"nmstate-operator-557fdffb88-xvfww\" (UID: \"63b01418-682e-4ebe-874d-aab5928c222a\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-xvfww" Nov 25 09:13:56 crc kubenswrapper[4565]: I1125 09:13:56.341169 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb2c5\" (UniqueName: \"kubernetes.io/projected/63b01418-682e-4ebe-874d-aab5928c222a-kube-api-access-mb2c5\") pod \"nmstate-operator-557fdffb88-xvfww\" (UID: \"63b01418-682e-4ebe-874d-aab5928c222a\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-xvfww" Nov 25 09:13:56 crc kubenswrapper[4565]: I1125 09:13:56.490651 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-xvfww" Nov 25 09:13:56 crc kubenswrapper[4565]: I1125 09:13:56.642996 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-xvfww"] Nov 25 09:13:56 crc kubenswrapper[4565]: W1125 09:13:56.651694 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63b01418_682e_4ebe_874d_aab5928c222a.slice/crio-2566c2f72fce5e752f0ef4005b4eceab5dfbdb8b93988b44b5fe9c073eaf5a24 WatchSource:0}: Error finding container 2566c2f72fce5e752f0ef4005b4eceab5dfbdb8b93988b44b5fe9c073eaf5a24: Status 404 returned error can't find the container with id 2566c2f72fce5e752f0ef4005b4eceab5dfbdb8b93988b44b5fe9c073eaf5a24 Nov 25 09:13:57 crc kubenswrapper[4565]: I1125 09:13:57.592250 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-xvfww" event={"ID":"63b01418-682e-4ebe-874d-aab5928c222a","Type":"ContainerStarted","Data":"2566c2f72fce5e752f0ef4005b4eceab5dfbdb8b93988b44b5fe9c073eaf5a24"} Nov 25 09:13:59 crc kubenswrapper[4565]: I1125 
09:13:59.604303 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-xvfww" event={"ID":"63b01418-682e-4ebe-874d-aab5928c222a","Type":"ContainerStarted","Data":"0ffb73c742b951e8dcca7a3871cd7002fd3994b4781e5601c94d917a5f279f45"} Nov 25 09:13:59 crc kubenswrapper[4565]: I1125 09:13:59.619314 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-557fdffb88-xvfww" podStartSLOduration=1.245426786 podStartE2EDuration="3.619288184s" podCreationTimestamp="2025-11-25 09:13:56 +0000 UTC" firstStartedPulling="2025-11-25 09:13:56.655683306 +0000 UTC m=+569.858178444" lastFinishedPulling="2025-11-25 09:13:59.029544705 +0000 UTC m=+572.232039842" observedRunningTime="2025-11-25 09:13:59.615488099 +0000 UTC m=+572.817983238" watchObservedRunningTime="2025-11-25 09:13:59.619288184 +0000 UTC m=+572.821783322" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.390337 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-gfhjf"] Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.391689 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-gfhjf" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.396159 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-xmrf5" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.419820 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-gfhjf"] Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.437908 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-bgc56"] Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.438425 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bgc56" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.442850 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.461444 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e12b0906-c6a2-468a-8bc1-a29bda6a25e3-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-bgc56\" (UID: \"e12b0906-c6a2-468a-8bc1-a29bda6a25e3\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bgc56" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.461486 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zp2r\" (UniqueName: \"kubernetes.io/projected/5745386c-25f6-4be7-bdc7-c299e25185d4-kube-api-access-6zp2r\") pod \"nmstate-metrics-5dcf9c57c5-gfhjf\" (UID: \"5745386c-25f6-4be7-bdc7-c299e25185d4\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-gfhjf" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.461519 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gswvr\" (UniqueName: \"kubernetes.io/projected/e12b0906-c6a2-468a-8bc1-a29bda6a25e3-kube-api-access-gswvr\") pod \"nmstate-webhook-6b89b748d8-bgc56\" (UID: \"e12b0906-c6a2-468a-8bc1-a29bda6a25e3\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bgc56" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.468781 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-b57v9"] Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.469641 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-b57v9" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.508890 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-bgc56"] Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.562559 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e12b0906-c6a2-468a-8bc1-a29bda6a25e3-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-bgc56\" (UID: \"e12b0906-c6a2-468a-8bc1-a29bda6a25e3\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bgc56" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.562600 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/26159181-25a7-4f96-8bf7-059faaff18e0-ovs-socket\") pod \"nmstate-handler-b57v9\" (UID: \"26159181-25a7-4f96-8bf7-059faaff18e0\") " pod="openshift-nmstate/nmstate-handler-b57v9" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.562617 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwk7t\" (UniqueName: \"kubernetes.io/projected/26159181-25a7-4f96-8bf7-059faaff18e0-kube-api-access-cwk7t\") pod \"nmstate-handler-b57v9\" (UID: \"26159181-25a7-4f96-8bf7-059faaff18e0\") " pod="openshift-nmstate/nmstate-handler-b57v9" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.562640 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/26159181-25a7-4f96-8bf7-059faaff18e0-nmstate-lock\") pod \"nmstate-handler-b57v9\" (UID: \"26159181-25a7-4f96-8bf7-059faaff18e0\") " pod="openshift-nmstate/nmstate-handler-b57v9" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.562656 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6zp2r\" (UniqueName: \"kubernetes.io/projected/5745386c-25f6-4be7-bdc7-c299e25185d4-kube-api-access-6zp2r\") pod \"nmstate-metrics-5dcf9c57c5-gfhjf\" (UID: \"5745386c-25f6-4be7-bdc7-c299e25185d4\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-gfhjf" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.562674 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/26159181-25a7-4f96-8bf7-059faaff18e0-dbus-socket\") pod \"nmstate-handler-b57v9\" (UID: \"26159181-25a7-4f96-8bf7-059faaff18e0\") " pod="openshift-nmstate/nmstate-handler-b57v9" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.562696 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gswvr\" (UniqueName: \"kubernetes.io/projected/e12b0906-c6a2-468a-8bc1-a29bda6a25e3-kube-api-access-gswvr\") pod \"nmstate-webhook-6b89b748d8-bgc56\" (UID: \"e12b0906-c6a2-468a-8bc1-a29bda6a25e3\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bgc56" Nov 25 09:14:00 crc kubenswrapper[4565]: E1125 09:14:00.562975 4565 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Nov 25 09:14:00 crc kubenswrapper[4565]: E1125 09:14:00.563122 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e12b0906-c6a2-468a-8bc1-a29bda6a25e3-tls-key-pair podName:e12b0906-c6a2-468a-8bc1-a29bda6a25e3 nodeName:}" failed. No retries permitted until 2025-11-25 09:14:01.063104816 +0000 UTC m=+574.265599955 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/e12b0906-c6a2-468a-8bc1-a29bda6a25e3-tls-key-pair") pod "nmstate-webhook-6b89b748d8-bgc56" (UID: "e12b0906-c6a2-468a-8bc1-a29bda6a25e3") : secret "openshift-nmstate-webhook" not found Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.582281 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zp2r\" (UniqueName: \"kubernetes.io/projected/5745386c-25f6-4be7-bdc7-c299e25185d4-kube-api-access-6zp2r\") pod \"nmstate-metrics-5dcf9c57c5-gfhjf\" (UID: \"5745386c-25f6-4be7-bdc7-c299e25185d4\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-gfhjf" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.596553 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gswvr\" (UniqueName: \"kubernetes.io/projected/e12b0906-c6a2-468a-8bc1-a29bda6a25e3-kube-api-access-gswvr\") pod \"nmstate-webhook-6b89b748d8-bgc56\" (UID: \"e12b0906-c6a2-468a-8bc1-a29bda6a25e3\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bgc56" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.651485 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-d25tx"] Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.652108 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-d25tx" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.655039 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.655057 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.655413 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-4kxdq" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.664548 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/26159181-25a7-4f96-8bf7-059faaff18e0-dbus-socket\") pod \"nmstate-handler-b57v9\" (UID: \"26159181-25a7-4f96-8bf7-059faaff18e0\") " pod="openshift-nmstate/nmstate-handler-b57v9" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.664619 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/538d2898-95e6-4651-89d1-d5cb979d7aab-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-d25tx\" (UID: \"538d2898-95e6-4651-89d1-d5cb979d7aab\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-d25tx" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.664657 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djk2c\" (UniqueName: \"kubernetes.io/projected/538d2898-95e6-4651-89d1-d5cb979d7aab-kube-api-access-djk2c\") pod \"nmstate-console-plugin-5874bd7bc5-d25tx\" (UID: \"538d2898-95e6-4651-89d1-d5cb979d7aab\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-d25tx" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.664685 4565 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/538d2898-95e6-4651-89d1-d5cb979d7aab-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-d25tx\" (UID: \"538d2898-95e6-4651-89d1-d5cb979d7aab\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-d25tx" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.664713 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/26159181-25a7-4f96-8bf7-059faaff18e0-ovs-socket\") pod \"nmstate-handler-b57v9\" (UID: \"26159181-25a7-4f96-8bf7-059faaff18e0\") " pod="openshift-nmstate/nmstate-handler-b57v9" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.664727 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwk7t\" (UniqueName: \"kubernetes.io/projected/26159181-25a7-4f96-8bf7-059faaff18e0-kube-api-access-cwk7t\") pod \"nmstate-handler-b57v9\" (UID: \"26159181-25a7-4f96-8bf7-059faaff18e0\") " pod="openshift-nmstate/nmstate-handler-b57v9" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.664744 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/26159181-25a7-4f96-8bf7-059faaff18e0-nmstate-lock\") pod \"nmstate-handler-b57v9\" (UID: \"26159181-25a7-4f96-8bf7-059faaff18e0\") " pod="openshift-nmstate/nmstate-handler-b57v9" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.664794 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/26159181-25a7-4f96-8bf7-059faaff18e0-nmstate-lock\") pod \"nmstate-handler-b57v9\" (UID: \"26159181-25a7-4f96-8bf7-059faaff18e0\") " pod="openshift-nmstate/nmstate-handler-b57v9" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.665082 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/26159181-25a7-4f96-8bf7-059faaff18e0-dbus-socket\") pod \"nmstate-handler-b57v9\" (UID: \"26159181-25a7-4f96-8bf7-059faaff18e0\") " pod="openshift-nmstate/nmstate-handler-b57v9" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.665543 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/26159181-25a7-4f96-8bf7-059faaff18e0-ovs-socket\") pod \"nmstate-handler-b57v9\" (UID: \"26159181-25a7-4f96-8bf7-059faaff18e0\") " pod="openshift-nmstate/nmstate-handler-b57v9" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.691362 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwk7t\" (UniqueName: \"kubernetes.io/projected/26159181-25a7-4f96-8bf7-059faaff18e0-kube-api-access-cwk7t\") pod \"nmstate-handler-b57v9\" (UID: \"26159181-25a7-4f96-8bf7-059faaff18e0\") " pod="openshift-nmstate/nmstate-handler-b57v9" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.704100 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-d25tx"] Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.706367 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-gfhjf" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.765535 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/538d2898-95e6-4651-89d1-d5cb979d7aab-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-d25tx\" (UID: \"538d2898-95e6-4651-89d1-d5cb979d7aab\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-d25tx" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.765844 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djk2c\" (UniqueName: \"kubernetes.io/projected/538d2898-95e6-4651-89d1-d5cb979d7aab-kube-api-access-djk2c\") pod \"nmstate-console-plugin-5874bd7bc5-d25tx\" (UID: \"538d2898-95e6-4651-89d1-d5cb979d7aab\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-d25tx" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.765873 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/538d2898-95e6-4651-89d1-d5cb979d7aab-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-d25tx\" (UID: \"538d2898-95e6-4651-89d1-d5cb979d7aab\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-d25tx" Nov 25 09:14:00 crc kubenswrapper[4565]: E1125 09:14:00.765697 4565 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Nov 25 09:14:00 crc kubenswrapper[4565]: E1125 09:14:00.765960 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/538d2898-95e6-4651-89d1-d5cb979d7aab-plugin-serving-cert podName:538d2898-95e6-4651-89d1-d5cb979d7aab nodeName:}" failed. No retries permitted until 2025-11-25 09:14:01.265940898 +0000 UTC m=+574.468436037 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/538d2898-95e6-4651-89d1-d5cb979d7aab-plugin-serving-cert") pod "nmstate-console-plugin-5874bd7bc5-d25tx" (UID: "538d2898-95e6-4651-89d1-d5cb979d7aab") : secret "plugin-serving-cert" not found Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.766661 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/538d2898-95e6-4651-89d1-d5cb979d7aab-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-d25tx\" (UID: \"538d2898-95e6-4651-89d1-d5cb979d7aab\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-d25tx" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.780639 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-b57v9" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.790895 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djk2c\" (UniqueName: \"kubernetes.io/projected/538d2898-95e6-4651-89d1-d5cb979d7aab-kube-api-access-djk2c\") pod \"nmstate-console-plugin-5874bd7bc5-d25tx\" (UID: \"538d2898-95e6-4651-89d1-d5cb979d7aab\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-d25tx" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.877635 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-78d9f5c977-tx2tx"] Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.878276 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.889832 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78d9f5c977-tx2tx"] Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.973347 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2ab90e6a-d60d-417e-bab4-f0c6d7e7855e-service-ca\") pod \"console-78d9f5c977-tx2tx\" (UID: \"2ab90e6a-d60d-417e-bab4-f0c6d7e7855e\") " pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.973412 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2ab90e6a-d60d-417e-bab4-f0c6d7e7855e-oauth-serving-cert\") pod \"console-78d9f5c977-tx2tx\" (UID: \"2ab90e6a-d60d-417e-bab4-f0c6d7e7855e\") " pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.973446 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ab90e6a-d60d-417e-bab4-f0c6d7e7855e-trusted-ca-bundle\") pod \"console-78d9f5c977-tx2tx\" (UID: \"2ab90e6a-d60d-417e-bab4-f0c6d7e7855e\") " pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.973503 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r4t4\" (UniqueName: \"kubernetes.io/projected/2ab90e6a-d60d-417e-bab4-f0c6d7e7855e-kube-api-access-4r4t4\") pod \"console-78d9f5c977-tx2tx\" (UID: \"2ab90e6a-d60d-417e-bab4-f0c6d7e7855e\") " pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.973733 4565 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2ab90e6a-d60d-417e-bab4-f0c6d7e7855e-console-oauth-config\") pod \"console-78d9f5c977-tx2tx\" (UID: \"2ab90e6a-d60d-417e-bab4-f0c6d7e7855e\") " pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.973786 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2ab90e6a-d60d-417e-bab4-f0c6d7e7855e-console-config\") pod \"console-78d9f5c977-tx2tx\" (UID: \"2ab90e6a-d60d-417e-bab4-f0c6d7e7855e\") " pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.973848 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ab90e6a-d60d-417e-bab4-f0c6d7e7855e-console-serving-cert\") pod \"console-78d9f5c977-tx2tx\" (UID: \"2ab90e6a-d60d-417e-bab4-f0c6d7e7855e\") " pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:00 crc kubenswrapper[4565]: I1125 09:14:00.980672 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-gfhjf"] Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.075264 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2ab90e6a-d60d-417e-bab4-f0c6d7e7855e-service-ca\") pod \"console-78d9f5c977-tx2tx\" (UID: \"2ab90e6a-d60d-417e-bab4-f0c6d7e7855e\") " pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.075338 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2ab90e6a-d60d-417e-bab4-f0c6d7e7855e-oauth-serving-cert\") pod 
\"console-78d9f5c977-tx2tx\" (UID: \"2ab90e6a-d60d-417e-bab4-f0c6d7e7855e\") " pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.075366 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ab90e6a-d60d-417e-bab4-f0c6d7e7855e-trusted-ca-bundle\") pod \"console-78d9f5c977-tx2tx\" (UID: \"2ab90e6a-d60d-417e-bab4-f0c6d7e7855e\") " pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.075401 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r4t4\" (UniqueName: \"kubernetes.io/projected/2ab90e6a-d60d-417e-bab4-f0c6d7e7855e-kube-api-access-4r4t4\") pod \"console-78d9f5c977-tx2tx\" (UID: \"2ab90e6a-d60d-417e-bab4-f0c6d7e7855e\") " pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.075427 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e12b0906-c6a2-468a-8bc1-a29bda6a25e3-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-bgc56\" (UID: \"e12b0906-c6a2-468a-8bc1-a29bda6a25e3\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bgc56" Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.075465 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2ab90e6a-d60d-417e-bab4-f0c6d7e7855e-console-oauth-config\") pod \"console-78d9f5c977-tx2tx\" (UID: \"2ab90e6a-d60d-417e-bab4-f0c6d7e7855e\") " pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.075491 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2ab90e6a-d60d-417e-bab4-f0c6d7e7855e-console-config\") pod 
\"console-78d9f5c977-tx2tx\" (UID: \"2ab90e6a-d60d-417e-bab4-f0c6d7e7855e\") " pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.075523 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ab90e6a-d60d-417e-bab4-f0c6d7e7855e-console-serving-cert\") pod \"console-78d9f5c977-tx2tx\" (UID: \"2ab90e6a-d60d-417e-bab4-f0c6d7e7855e\") " pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.076353 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2ab90e6a-d60d-417e-bab4-f0c6d7e7855e-oauth-serving-cert\") pod \"console-78d9f5c977-tx2tx\" (UID: \"2ab90e6a-d60d-417e-bab4-f0c6d7e7855e\") " pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.077067 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2ab90e6a-d60d-417e-bab4-f0c6d7e7855e-service-ca\") pod \"console-78d9f5c977-tx2tx\" (UID: \"2ab90e6a-d60d-417e-bab4-f0c6d7e7855e\") " pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.077320 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2ab90e6a-d60d-417e-bab4-f0c6d7e7855e-console-config\") pod \"console-78d9f5c977-tx2tx\" (UID: \"2ab90e6a-d60d-417e-bab4-f0c6d7e7855e\") " pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.078310 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ab90e6a-d60d-417e-bab4-f0c6d7e7855e-trusted-ca-bundle\") pod \"console-78d9f5c977-tx2tx\" (UID: 
\"2ab90e6a-d60d-417e-bab4-f0c6d7e7855e\") " pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.081456 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2ab90e6a-d60d-417e-bab4-f0c6d7e7855e-console-oauth-config\") pod \"console-78d9f5c977-tx2tx\" (UID: \"2ab90e6a-d60d-417e-bab4-f0c6d7e7855e\") " pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.081472 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ab90e6a-d60d-417e-bab4-f0c6d7e7855e-console-serving-cert\") pod \"console-78d9f5c977-tx2tx\" (UID: \"2ab90e6a-d60d-417e-bab4-f0c6d7e7855e\") " pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.081673 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e12b0906-c6a2-468a-8bc1-a29bda6a25e3-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-bgc56\" (UID: \"e12b0906-c6a2-468a-8bc1-a29bda6a25e3\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bgc56" Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.090540 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r4t4\" (UniqueName: \"kubernetes.io/projected/2ab90e6a-d60d-417e-bab4-f0c6d7e7855e-kube-api-access-4r4t4\") pod \"console-78d9f5c977-tx2tx\" (UID: \"2ab90e6a-d60d-417e-bab4-f0c6d7e7855e\") " pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.201256 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.278442 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/538d2898-95e6-4651-89d1-d5cb979d7aab-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-d25tx\" (UID: \"538d2898-95e6-4651-89d1-d5cb979d7aab\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-d25tx" Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.283408 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/538d2898-95e6-4651-89d1-d5cb979d7aab-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-d25tx\" (UID: \"538d2898-95e6-4651-89d1-d5cb979d7aab\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-d25tx" Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.349638 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bgc56" Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.386868 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78d9f5c977-tx2tx"] Nov 25 09:14:01 crc kubenswrapper[4565]: W1125 09:14:01.392210 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ab90e6a_d60d_417e_bab4_f0c6d7e7855e.slice/crio-430fc444db06bea2654f4dcaac7e6131459130114eac165dd4e24d0c61d12258 WatchSource:0}: Error finding container 430fc444db06bea2654f4dcaac7e6131459130114eac165dd4e24d0c61d12258: Status 404 returned error can't find the container with id 430fc444db06bea2654f4dcaac7e6131459130114eac165dd4e24d0c61d12258 Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.553837 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-bgc56"] Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.564821 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-d25tx" Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.616467 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-gfhjf" event={"ID":"5745386c-25f6-4be7-bdc7-c299e25185d4","Type":"ContainerStarted","Data":"9d36f869163413b5efc142afa8313d572e8bd6a3d0d80aa19ef5607e5389b838"} Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.617977 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-b57v9" event={"ID":"26159181-25a7-4f96-8bf7-059faaff18e0","Type":"ContainerStarted","Data":"9cf559b7f3d7b020c802e4ecd28c4ab10d950561db01e9c98a56e530ff2a603a"} Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.619236 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78d9f5c977-tx2tx" event={"ID":"2ab90e6a-d60d-417e-bab4-f0c6d7e7855e","Type":"ContainerStarted","Data":"bc478d72a44414fe2b38e0c28f3d185af831addb72291359dc866817313c561e"} Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.619275 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78d9f5c977-tx2tx" event={"ID":"2ab90e6a-d60d-417e-bab4-f0c6d7e7855e","Type":"ContainerStarted","Data":"430fc444db06bea2654f4dcaac7e6131459130114eac165dd4e24d0c61d12258"} Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.621643 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bgc56" event={"ID":"e12b0906-c6a2-468a-8bc1-a29bda6a25e3","Type":"ContainerStarted","Data":"4a4ae91f372c77784cfd4f650fa979a8d2de7ceaf56a7fbb5d55913ac8ae1e99"} Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.638875 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-78d9f5c977-tx2tx" podStartSLOduration=1.638851246 podStartE2EDuration="1.638851246s" podCreationTimestamp="2025-11-25 09:14:00 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:14:01.638351193 +0000 UTC m=+574.840846331" watchObservedRunningTime="2025-11-25 09:14:01.638851246 +0000 UTC m=+574.841346384" Nov 25 09:14:01 crc kubenswrapper[4565]: I1125 09:14:01.709294 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-d25tx"] Nov 25 09:14:02 crc kubenswrapper[4565]: I1125 09:14:02.627323 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-d25tx" event={"ID":"538d2898-95e6-4651-89d1-d5cb979d7aab","Type":"ContainerStarted","Data":"9326d49c320ccb7d39211adb0cd64d2bfba79a994e93cc55b7a74ef3c1d88391"} Nov 25 09:14:04 crc kubenswrapper[4565]: I1125 09:14:04.638873 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-d25tx" event={"ID":"538d2898-95e6-4651-89d1-d5cb979d7aab","Type":"ContainerStarted","Data":"1b66533c39a1f4733fad0820b30aac59e78a7801f0940eb5305442229e11975c"} Nov 25 09:14:04 crc kubenswrapper[4565]: I1125 09:14:04.641089 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-gfhjf" event={"ID":"5745386c-25f6-4be7-bdc7-c299e25185d4","Type":"ContainerStarted","Data":"9a41feb92c986f22f9d0f56e16a260bcdcc68423aafe6505c89185f79221e3ca"} Nov 25 09:14:04 crc kubenswrapper[4565]: I1125 09:14:04.642337 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-b57v9" event={"ID":"26159181-25a7-4f96-8bf7-059faaff18e0","Type":"ContainerStarted","Data":"82c23d4ab0f227c4134e45095194dcda7442adeda709e54ac0f45b22d7b54162"} Nov 25 09:14:04 crc kubenswrapper[4565]: I1125 09:14:04.642473 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-b57v9" Nov 25 09:14:04 crc kubenswrapper[4565]: I1125 
09:14:04.644294 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bgc56" event={"ID":"e12b0906-c6a2-468a-8bc1-a29bda6a25e3","Type":"ContainerStarted","Data":"3e6c3c7a7985b1a7093fdedb2297b9d7f31448263d245c0d8554b9e69c196006"} Nov 25 09:14:04 crc kubenswrapper[4565]: I1125 09:14:04.644487 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bgc56" Nov 25 09:14:04 crc kubenswrapper[4565]: I1125 09:14:04.655833 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-d25tx" podStartSLOduration=2.15799524 podStartE2EDuration="4.655784669s" podCreationTimestamp="2025-11-25 09:14:00 +0000 UTC" firstStartedPulling="2025-11-25 09:14:01.722536384 +0000 UTC m=+574.925031522" lastFinishedPulling="2025-11-25 09:14:04.220325814 +0000 UTC m=+577.422820951" observedRunningTime="2025-11-25 09:14:04.651681463 +0000 UTC m=+577.854176601" watchObservedRunningTime="2025-11-25 09:14:04.655784669 +0000 UTC m=+577.858279807" Nov 25 09:14:04 crc kubenswrapper[4565]: I1125 09:14:04.666361 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-b57v9" podStartSLOduration=1.293919989 podStartE2EDuration="4.666352172s" podCreationTimestamp="2025-11-25 09:14:00 +0000 UTC" firstStartedPulling="2025-11-25 09:14:00.815961817 +0000 UTC m=+574.018456955" lastFinishedPulling="2025-11-25 09:14:04.188393999 +0000 UTC m=+577.390889138" observedRunningTime="2025-11-25 09:14:04.663004791 +0000 UTC m=+577.865499929" watchObservedRunningTime="2025-11-25 09:14:04.666352172 +0000 UTC m=+577.868847311" Nov 25 09:14:04 crc kubenswrapper[4565]: I1125 09:14:04.680702 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bgc56" podStartSLOduration=2.017004306 podStartE2EDuration="4.68068807s" 
podCreationTimestamp="2025-11-25 09:14:00 +0000 UTC" firstStartedPulling="2025-11-25 09:14:01.547163279 +0000 UTC m=+574.749658417" lastFinishedPulling="2025-11-25 09:14:04.210847043 +0000 UTC m=+577.413342181" observedRunningTime="2025-11-25 09:14:04.678426327 +0000 UTC m=+577.880921465" watchObservedRunningTime="2025-11-25 09:14:04.68068807 +0000 UTC m=+577.883183209" Nov 25 09:14:06 crc kubenswrapper[4565]: I1125 09:14:06.658909 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-gfhjf" event={"ID":"5745386c-25f6-4be7-bdc7-c299e25185d4","Type":"ContainerStarted","Data":"5fca7bd8dca3fca30d6df4071e589503aac1c645fcaf36f20b4e4a181a80fe1b"} Nov 25 09:14:06 crc kubenswrapper[4565]: I1125 09:14:06.676860 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-gfhjf" podStartSLOduration=1.420913893 podStartE2EDuration="6.676845715s" podCreationTimestamp="2025-11-25 09:14:00 +0000 UTC" firstStartedPulling="2025-11-25 09:14:00.987035706 +0000 UTC m=+574.189530845" lastFinishedPulling="2025-11-25 09:14:06.242967529 +0000 UTC m=+579.445462667" observedRunningTime="2025-11-25 09:14:06.676331435 +0000 UTC m=+579.878826572" watchObservedRunningTime="2025-11-25 09:14:06.676845715 +0000 UTC m=+579.879340843" Nov 25 09:14:10 crc kubenswrapper[4565]: I1125 09:14:10.796529 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-b57v9" Nov 25 09:14:11 crc kubenswrapper[4565]: I1125 09:14:11.202267 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:11 crc kubenswrapper[4565]: I1125 09:14:11.202363 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:11 crc kubenswrapper[4565]: I1125 09:14:11.207113 4565 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:11 crc kubenswrapper[4565]: I1125 09:14:11.682762 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-78d9f5c977-tx2tx" Nov 25 09:14:11 crc kubenswrapper[4565]: I1125 09:14:11.720120 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-wvc8w"] Nov 25 09:14:21 crc kubenswrapper[4565]: I1125 09:14:21.355244 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bgc56" Nov 25 09:14:30 crc kubenswrapper[4565]: I1125 09:14:30.300516 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m"] Nov 25 09:14:30 crc kubenswrapper[4565]: I1125 09:14:30.302131 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m" Nov 25 09:14:30 crc kubenswrapper[4565]: I1125 09:14:30.303585 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 25 09:14:30 crc kubenswrapper[4565]: I1125 09:14:30.305772 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m"] Nov 25 09:14:30 crc kubenswrapper[4565]: I1125 09:14:30.475665 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f291e538-ed14-4652-bf08-0b52ac487353-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m\" (UID: \"f291e538-ed14-4652-bf08-0b52ac487353\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m" Nov 25 09:14:30 crc kubenswrapper[4565]: I1125 09:14:30.475742 4565 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f291e538-ed14-4652-bf08-0b52ac487353-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m\" (UID: \"f291e538-ed14-4652-bf08-0b52ac487353\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m" Nov 25 09:14:30 crc kubenswrapper[4565]: I1125 09:14:30.475764 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv7km\" (UniqueName: \"kubernetes.io/projected/f291e538-ed14-4652-bf08-0b52ac487353-kube-api-access-hv7km\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m\" (UID: \"f291e538-ed14-4652-bf08-0b52ac487353\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m" Nov 25 09:14:30 crc kubenswrapper[4565]: I1125 09:14:30.576358 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f291e538-ed14-4652-bf08-0b52ac487353-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m\" (UID: \"f291e538-ed14-4652-bf08-0b52ac487353\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m" Nov 25 09:14:30 crc kubenswrapper[4565]: I1125 09:14:30.576438 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f291e538-ed14-4652-bf08-0b52ac487353-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m\" (UID: \"f291e538-ed14-4652-bf08-0b52ac487353\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m" Nov 25 09:14:30 crc kubenswrapper[4565]: I1125 09:14:30.576458 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv7km\" (UniqueName: 
\"kubernetes.io/projected/f291e538-ed14-4652-bf08-0b52ac487353-kube-api-access-hv7km\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m\" (UID: \"f291e538-ed14-4652-bf08-0b52ac487353\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m" Nov 25 09:14:30 crc kubenswrapper[4565]: I1125 09:14:30.576886 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f291e538-ed14-4652-bf08-0b52ac487353-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m\" (UID: \"f291e538-ed14-4652-bf08-0b52ac487353\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m" Nov 25 09:14:30 crc kubenswrapper[4565]: I1125 09:14:30.576959 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f291e538-ed14-4652-bf08-0b52ac487353-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m\" (UID: \"f291e538-ed14-4652-bf08-0b52ac487353\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m" Nov 25 09:14:30 crc kubenswrapper[4565]: I1125 09:14:30.592306 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv7km\" (UniqueName: \"kubernetes.io/projected/f291e538-ed14-4652-bf08-0b52ac487353-kube-api-access-hv7km\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m\" (UID: \"f291e538-ed14-4652-bf08-0b52ac487353\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m" Nov 25 09:14:30 crc kubenswrapper[4565]: I1125 09:14:30.614466 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m" Nov 25 09:14:30 crc kubenswrapper[4565]: I1125 09:14:30.945246 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m"] Nov 25 09:14:31 crc kubenswrapper[4565]: I1125 09:14:31.770707 4565 generic.go:334] "Generic (PLEG): container finished" podID="f291e538-ed14-4652-bf08-0b52ac487353" containerID="fa3d2f18c97e34db001d006058a990a0112d6dee3520d257bad22ebe00dac096" exitCode=0 Nov 25 09:14:31 crc kubenswrapper[4565]: I1125 09:14:31.770968 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m" event={"ID":"f291e538-ed14-4652-bf08-0b52ac487353","Type":"ContainerDied","Data":"fa3d2f18c97e34db001d006058a990a0112d6dee3520d257bad22ebe00dac096"} Nov 25 09:14:31 crc kubenswrapper[4565]: I1125 09:14:31.770993 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m" event={"ID":"f291e538-ed14-4652-bf08-0b52ac487353","Type":"ContainerStarted","Data":"a0c0cd411d16d2e5a5b1f5f63d95a52d9bfeb7d43987b1536c04842634e29218"} Nov 25 09:14:33 crc kubenswrapper[4565]: I1125 09:14:33.780200 4565 generic.go:334] "Generic (PLEG): container finished" podID="f291e538-ed14-4652-bf08-0b52ac487353" containerID="f71c8d34fff48fc1a8aa4dc9d43a8217138dba3812bce97cea2163e3afd559b2" exitCode=0 Nov 25 09:14:33 crc kubenswrapper[4565]: I1125 09:14:33.780292 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m" event={"ID":"f291e538-ed14-4652-bf08-0b52ac487353","Type":"ContainerDied","Data":"f71c8d34fff48fc1a8aa4dc9d43a8217138dba3812bce97cea2163e3afd559b2"} Nov 25 09:14:34 crc kubenswrapper[4565]: I1125 09:14:34.786257 4565 
generic.go:334] "Generic (PLEG): container finished" podID="f291e538-ed14-4652-bf08-0b52ac487353" containerID="88c627f9a8a3972a68b30dc8b13bbf8bb922bc85144ea345a9e8f6d349a153f1" exitCode=0 Nov 25 09:14:34 crc kubenswrapper[4565]: I1125 09:14:34.786291 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m" event={"ID":"f291e538-ed14-4652-bf08-0b52ac487353","Type":"ContainerDied","Data":"88c627f9a8a3972a68b30dc8b13bbf8bb922bc85144ea345a9e8f6d349a153f1"} Nov 25 09:14:35 crc kubenswrapper[4565]: I1125 09:14:35.947777 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m" Nov 25 09:14:36 crc kubenswrapper[4565]: I1125 09:14:36.130621 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f291e538-ed14-4652-bf08-0b52ac487353-bundle\") pod \"f291e538-ed14-4652-bf08-0b52ac487353\" (UID: \"f291e538-ed14-4652-bf08-0b52ac487353\") " Nov 25 09:14:36 crc kubenswrapper[4565]: I1125 09:14:36.130657 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv7km\" (UniqueName: \"kubernetes.io/projected/f291e538-ed14-4652-bf08-0b52ac487353-kube-api-access-hv7km\") pod \"f291e538-ed14-4652-bf08-0b52ac487353\" (UID: \"f291e538-ed14-4652-bf08-0b52ac487353\") " Nov 25 09:14:36 crc kubenswrapper[4565]: I1125 09:14:36.130676 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f291e538-ed14-4652-bf08-0b52ac487353-util\") pod \"f291e538-ed14-4652-bf08-0b52ac487353\" (UID: \"f291e538-ed14-4652-bf08-0b52ac487353\") " Nov 25 09:14:36 crc kubenswrapper[4565]: I1125 09:14:36.131430 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f291e538-ed14-4652-bf08-0b52ac487353-bundle" (OuterVolumeSpecName: "bundle") pod "f291e538-ed14-4652-bf08-0b52ac487353" (UID: "f291e538-ed14-4652-bf08-0b52ac487353"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:14:36 crc kubenswrapper[4565]: I1125 09:14:36.135120 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f291e538-ed14-4652-bf08-0b52ac487353-kube-api-access-hv7km" (OuterVolumeSpecName: "kube-api-access-hv7km") pod "f291e538-ed14-4652-bf08-0b52ac487353" (UID: "f291e538-ed14-4652-bf08-0b52ac487353"). InnerVolumeSpecName "kube-api-access-hv7km". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:14:36 crc kubenswrapper[4565]: I1125 09:14:36.140476 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f291e538-ed14-4652-bf08-0b52ac487353-util" (OuterVolumeSpecName: "util") pod "f291e538-ed14-4652-bf08-0b52ac487353" (UID: "f291e538-ed14-4652-bf08-0b52ac487353"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:14:36 crc kubenswrapper[4565]: I1125 09:14:36.231309 4565 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f291e538-ed14-4652-bf08-0b52ac487353-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:14:36 crc kubenswrapper[4565]: I1125 09:14:36.231417 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv7km\" (UniqueName: \"kubernetes.io/projected/f291e538-ed14-4652-bf08-0b52ac487353-kube-api-access-hv7km\") on node \"crc\" DevicePath \"\"" Nov 25 09:14:36 crc kubenswrapper[4565]: I1125 09:14:36.231492 4565 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f291e538-ed14-4652-bf08-0b52ac487353-util\") on node \"crc\" DevicePath \"\"" Nov 25 09:14:36 crc kubenswrapper[4565]: I1125 09:14:36.747573 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-wvc8w" podUID="418a0125-b167-49b8-b6bd-0c97a587107c" containerName="console" containerID="cri-o://db3f1e587c02b47ee982c0f70557746891ba360c0616efe70ac62c187b3e7dd1" gracePeriod=15 Nov 25 09:14:36 crc kubenswrapper[4565]: I1125 09:14:36.795459 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m" event={"ID":"f291e538-ed14-4652-bf08-0b52ac487353","Type":"ContainerDied","Data":"a0c0cd411d16d2e5a5b1f5f63d95a52d9bfeb7d43987b1536c04842634e29218"} Nov 25 09:14:36 crc kubenswrapper[4565]: I1125 09:14:36.795496 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0c0cd411d16d2e5a5b1f5f63d95a52d9bfeb7d43987b1536c04842634e29218" Nov 25 09:14:36 crc kubenswrapper[4565]: I1125 09:14:36.795502 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m" Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.017059 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-wvc8w_418a0125-b167-49b8-b6bd-0c97a587107c/console/0.log" Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.017248 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.139369 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/418a0125-b167-49b8-b6bd-0c97a587107c-service-ca\") pod \"418a0125-b167-49b8-b6bd-0c97a587107c\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.139586 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/418a0125-b167-49b8-b6bd-0c97a587107c-console-oauth-config\") pod \"418a0125-b167-49b8-b6bd-0c97a587107c\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.139637 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzts8\" (UniqueName: \"kubernetes.io/projected/418a0125-b167-49b8-b6bd-0c97a587107c-kube-api-access-fzts8\") pod \"418a0125-b167-49b8-b6bd-0c97a587107c\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.139664 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/418a0125-b167-49b8-b6bd-0c97a587107c-console-serving-cert\") pod \"418a0125-b167-49b8-b6bd-0c97a587107c\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " Nov 25 09:14:37 crc 
kubenswrapper[4565]: I1125 09:14:37.139682 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/418a0125-b167-49b8-b6bd-0c97a587107c-console-config\") pod \"418a0125-b167-49b8-b6bd-0c97a587107c\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.139706 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/418a0125-b167-49b8-b6bd-0c97a587107c-trusted-ca-bundle\") pod \"418a0125-b167-49b8-b6bd-0c97a587107c\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.139729 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/418a0125-b167-49b8-b6bd-0c97a587107c-oauth-serving-cert\") pod \"418a0125-b167-49b8-b6bd-0c97a587107c\" (UID: \"418a0125-b167-49b8-b6bd-0c97a587107c\") " Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.140402 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/418a0125-b167-49b8-b6bd-0c97a587107c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "418a0125-b167-49b8-b6bd-0c97a587107c" (UID: "418a0125-b167-49b8-b6bd-0c97a587107c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.140452 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/418a0125-b167-49b8-b6bd-0c97a587107c-console-config" (OuterVolumeSpecName: "console-config") pod "418a0125-b167-49b8-b6bd-0c97a587107c" (UID: "418a0125-b167-49b8-b6bd-0c97a587107c"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.140468 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/418a0125-b167-49b8-b6bd-0c97a587107c-service-ca" (OuterVolumeSpecName: "service-ca") pod "418a0125-b167-49b8-b6bd-0c97a587107c" (UID: "418a0125-b167-49b8-b6bd-0c97a587107c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.140489 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/418a0125-b167-49b8-b6bd-0c97a587107c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "418a0125-b167-49b8-b6bd-0c97a587107c" (UID: "418a0125-b167-49b8-b6bd-0c97a587107c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.143562 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/418a0125-b167-49b8-b6bd-0c97a587107c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "418a0125-b167-49b8-b6bd-0c97a587107c" (UID: "418a0125-b167-49b8-b6bd-0c97a587107c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.143905 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/418a0125-b167-49b8-b6bd-0c97a587107c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "418a0125-b167-49b8-b6bd-0c97a587107c" (UID: "418a0125-b167-49b8-b6bd-0c97a587107c"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.144408 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/418a0125-b167-49b8-b6bd-0c97a587107c-kube-api-access-fzts8" (OuterVolumeSpecName: "kube-api-access-fzts8") pod "418a0125-b167-49b8-b6bd-0c97a587107c" (UID: "418a0125-b167-49b8-b6bd-0c97a587107c"). InnerVolumeSpecName "kube-api-access-fzts8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.240464 4565 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/418a0125-b167-49b8-b6bd-0c97a587107c-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.240488 4565 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/418a0125-b167-49b8-b6bd-0c97a587107c-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.240498 4565 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/418a0125-b167-49b8-b6bd-0c97a587107c-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.240506 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzts8\" (UniqueName: \"kubernetes.io/projected/418a0125-b167-49b8-b6bd-0c97a587107c-kube-api-access-fzts8\") on node \"crc\" DevicePath \"\"" Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.240517 4565 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/418a0125-b167-49b8-b6bd-0c97a587107c-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.240525 4565 reconciler_common.go:293] "Volume detached 
for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/418a0125-b167-49b8-b6bd-0c97a587107c-console-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.240532 4565 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/418a0125-b167-49b8-b6bd-0c97a587107c-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.801594 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-wvc8w_418a0125-b167-49b8-b6bd-0c97a587107c/console/0.log" Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.801638 4565 generic.go:334] "Generic (PLEG): container finished" podID="418a0125-b167-49b8-b6bd-0c97a587107c" containerID="db3f1e587c02b47ee982c0f70557746891ba360c0616efe70ac62c187b3e7dd1" exitCode=2 Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.801663 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wvc8w" event={"ID":"418a0125-b167-49b8-b6bd-0c97a587107c","Type":"ContainerDied","Data":"db3f1e587c02b47ee982c0f70557746891ba360c0616efe70ac62c187b3e7dd1"} Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.801692 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wvc8w" event={"ID":"418a0125-b167-49b8-b6bd-0c97a587107c","Type":"ContainerDied","Data":"e7e84d1d403aace90911e374e0f88713e50ce0bb106a4a61448f5b0a249d1d11"} Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.801707 4565 scope.go:117] "RemoveContainer" containerID="db3f1e587c02b47ee982c0f70557746891ba360c0616efe70ac62c187b3e7dd1" Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.801722 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-wvc8w" Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.813974 4565 scope.go:117] "RemoveContainer" containerID="db3f1e587c02b47ee982c0f70557746891ba360c0616efe70ac62c187b3e7dd1" Nov 25 09:14:37 crc kubenswrapper[4565]: E1125 09:14:37.814281 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db3f1e587c02b47ee982c0f70557746891ba360c0616efe70ac62c187b3e7dd1\": container with ID starting with db3f1e587c02b47ee982c0f70557746891ba360c0616efe70ac62c187b3e7dd1 not found: ID does not exist" containerID="db3f1e587c02b47ee982c0f70557746891ba360c0616efe70ac62c187b3e7dd1" Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.814307 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db3f1e587c02b47ee982c0f70557746891ba360c0616efe70ac62c187b3e7dd1"} err="failed to get container status \"db3f1e587c02b47ee982c0f70557746891ba360c0616efe70ac62c187b3e7dd1\": rpc error: code = NotFound desc = could not find container \"db3f1e587c02b47ee982c0f70557746891ba360c0616efe70ac62c187b3e7dd1\": container with ID starting with db3f1e587c02b47ee982c0f70557746891ba360c0616efe70ac62c187b3e7dd1 not found: ID does not exist" Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.822529 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-wvc8w"] Nov 25 09:14:37 crc kubenswrapper[4565]: I1125 09:14:37.824295 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-wvc8w"] Nov 25 09:14:39 crc kubenswrapper[4565]: I1125 09:14:39.102559 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="418a0125-b167-49b8-b6bd-0c97a587107c" path="/var/lib/kubelet/pods/418a0125-b167-49b8-b6bd-0c97a587107c/volumes" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.347922 4565 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp"] Nov 25 09:14:46 crc kubenswrapper[4565]: E1125 09:14:46.348409 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f291e538-ed14-4652-bf08-0b52ac487353" containerName="util" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.348421 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f291e538-ed14-4652-bf08-0b52ac487353" containerName="util" Nov 25 09:14:46 crc kubenswrapper[4565]: E1125 09:14:46.348432 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f291e538-ed14-4652-bf08-0b52ac487353" containerName="pull" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.348437 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f291e538-ed14-4652-bf08-0b52ac487353" containerName="pull" Nov 25 09:14:46 crc kubenswrapper[4565]: E1125 09:14:46.348451 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="418a0125-b167-49b8-b6bd-0c97a587107c" containerName="console" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.348457 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="418a0125-b167-49b8-b6bd-0c97a587107c" containerName="console" Nov 25 09:14:46 crc kubenswrapper[4565]: E1125 09:14:46.348464 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f291e538-ed14-4652-bf08-0b52ac487353" containerName="extract" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.348469 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f291e538-ed14-4652-bf08-0b52ac487353" containerName="extract" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.348552 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="f291e538-ed14-4652-bf08-0b52ac487353" containerName="extract" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.348561 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="418a0125-b167-49b8-b6bd-0c97a587107c" containerName="console" Nov 25 
09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.348854 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.350601 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.351145 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.351671 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-j2q7d" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.352977 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.353110 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.363849 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp"] Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.446042 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8shxp\" (UniqueName: \"kubernetes.io/projected/145e5d59-fd78-4bc1-a97c-17ebf0d67fa4-kube-api-access-8shxp\") pod \"metallb-operator-controller-manager-74454849f9-fjwfp\" (UID: \"145e5d59-fd78-4bc1-a97c-17ebf0d67fa4\") " pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.446100 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/145e5d59-fd78-4bc1-a97c-17ebf0d67fa4-apiservice-cert\") pod \"metallb-operator-controller-manager-74454849f9-fjwfp\" (UID: \"145e5d59-fd78-4bc1-a97c-17ebf0d67fa4\") " pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.446130 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/145e5d59-fd78-4bc1-a97c-17ebf0d67fa4-webhook-cert\") pod \"metallb-operator-controller-manager-74454849f9-fjwfp\" (UID: \"145e5d59-fd78-4bc1-a97c-17ebf0d67fa4\") " pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.546778 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8shxp\" (UniqueName: \"kubernetes.io/projected/145e5d59-fd78-4bc1-a97c-17ebf0d67fa4-kube-api-access-8shxp\") pod \"metallb-operator-controller-manager-74454849f9-fjwfp\" (UID: \"145e5d59-fd78-4bc1-a97c-17ebf0d67fa4\") " pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.546827 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/145e5d59-fd78-4bc1-a97c-17ebf0d67fa4-apiservice-cert\") pod \"metallb-operator-controller-manager-74454849f9-fjwfp\" (UID: \"145e5d59-fd78-4bc1-a97c-17ebf0d67fa4\") " pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.546861 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/145e5d59-fd78-4bc1-a97c-17ebf0d67fa4-webhook-cert\") pod \"metallb-operator-controller-manager-74454849f9-fjwfp\" (UID: \"145e5d59-fd78-4bc1-a97c-17ebf0d67fa4\") " 
pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.558724 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/145e5d59-fd78-4bc1-a97c-17ebf0d67fa4-apiservice-cert\") pod \"metallb-operator-controller-manager-74454849f9-fjwfp\" (UID: \"145e5d59-fd78-4bc1-a97c-17ebf0d67fa4\") " pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.559098 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/145e5d59-fd78-4bc1-a97c-17ebf0d67fa4-webhook-cert\") pod \"metallb-operator-controller-manager-74454849f9-fjwfp\" (UID: \"145e5d59-fd78-4bc1-a97c-17ebf0d67fa4\") " pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.559404 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8shxp\" (UniqueName: \"kubernetes.io/projected/145e5d59-fd78-4bc1-a97c-17ebf0d67fa4-kube-api-access-8shxp\") pod \"metallb-operator-controller-manager-74454849f9-fjwfp\" (UID: \"145e5d59-fd78-4bc1-a97c-17ebf0d67fa4\") " pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.650858 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7fb8cb44b7-5dvrd"] Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.651435 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7fb8cb44b7-5dvrd" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.654461 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.654704 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-hm54w" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.654819 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.660861 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.674511 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7fb8cb44b7-5dvrd"] Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.748634 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14fec6d4-6935-4283-944d-6a229b3cdc82-webhook-cert\") pod \"metallb-operator-webhook-server-7fb8cb44b7-5dvrd\" (UID: \"14fec6d4-6935-4283-944d-6a229b3cdc82\") " pod="metallb-system/metallb-operator-webhook-server-7fb8cb44b7-5dvrd" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.748689 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14fec6d4-6935-4283-944d-6a229b3cdc82-apiservice-cert\") pod \"metallb-operator-webhook-server-7fb8cb44b7-5dvrd\" (UID: \"14fec6d4-6935-4283-944d-6a229b3cdc82\") " pod="metallb-system/metallb-operator-webhook-server-7fb8cb44b7-5dvrd" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.748765 
4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bbq7\" (UniqueName: \"kubernetes.io/projected/14fec6d4-6935-4283-944d-6a229b3cdc82-kube-api-access-8bbq7\") pod \"metallb-operator-webhook-server-7fb8cb44b7-5dvrd\" (UID: \"14fec6d4-6935-4283-944d-6a229b3cdc82\") " pod="metallb-system/metallb-operator-webhook-server-7fb8cb44b7-5dvrd" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.850352 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bbq7\" (UniqueName: \"kubernetes.io/projected/14fec6d4-6935-4283-944d-6a229b3cdc82-kube-api-access-8bbq7\") pod \"metallb-operator-webhook-server-7fb8cb44b7-5dvrd\" (UID: \"14fec6d4-6935-4283-944d-6a229b3cdc82\") " pod="metallb-system/metallb-operator-webhook-server-7fb8cb44b7-5dvrd" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.850639 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14fec6d4-6935-4283-944d-6a229b3cdc82-webhook-cert\") pod \"metallb-operator-webhook-server-7fb8cb44b7-5dvrd\" (UID: \"14fec6d4-6935-4283-944d-6a229b3cdc82\") " pod="metallb-system/metallb-operator-webhook-server-7fb8cb44b7-5dvrd" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.850688 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14fec6d4-6935-4283-944d-6a229b3cdc82-apiservice-cert\") pod \"metallb-operator-webhook-server-7fb8cb44b7-5dvrd\" (UID: \"14fec6d4-6935-4283-944d-6a229b3cdc82\") " pod="metallb-system/metallb-operator-webhook-server-7fb8cb44b7-5dvrd" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.854534 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14fec6d4-6935-4283-944d-6a229b3cdc82-webhook-cert\") pod 
\"metallb-operator-webhook-server-7fb8cb44b7-5dvrd\" (UID: \"14fec6d4-6935-4283-944d-6a229b3cdc82\") " pod="metallb-system/metallb-operator-webhook-server-7fb8cb44b7-5dvrd" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.861659 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14fec6d4-6935-4283-944d-6a229b3cdc82-apiservice-cert\") pod \"metallb-operator-webhook-server-7fb8cb44b7-5dvrd\" (UID: \"14fec6d4-6935-4283-944d-6a229b3cdc82\") " pod="metallb-system/metallb-operator-webhook-server-7fb8cb44b7-5dvrd" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.869614 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bbq7\" (UniqueName: \"kubernetes.io/projected/14fec6d4-6935-4283-944d-6a229b3cdc82-kube-api-access-8bbq7\") pod \"metallb-operator-webhook-server-7fb8cb44b7-5dvrd\" (UID: \"14fec6d4-6935-4283-944d-6a229b3cdc82\") " pod="metallb-system/metallb-operator-webhook-server-7fb8cb44b7-5dvrd" Nov 25 09:14:46 crc kubenswrapper[4565]: I1125 09:14:46.963447 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7fb8cb44b7-5dvrd" Nov 25 09:14:47 crc kubenswrapper[4565]: I1125 09:14:47.063053 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp"] Nov 25 09:14:47 crc kubenswrapper[4565]: W1125 09:14:47.063468 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod145e5d59_fd78_4bc1_a97c_17ebf0d67fa4.slice/crio-eb1f0faae506db15faab3de0a8e8c1b24b8ea58571302317f6c6464f2010810b WatchSource:0}: Error finding container eb1f0faae506db15faab3de0a8e8c1b24b8ea58571302317f6c6464f2010810b: Status 404 returned error can't find the container with id eb1f0faae506db15faab3de0a8e8c1b24b8ea58571302317f6c6464f2010810b Nov 25 09:14:47 crc kubenswrapper[4565]: I1125 09:14:47.143431 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7fb8cb44b7-5dvrd"] Nov 25 09:14:47 crc kubenswrapper[4565]: W1125 09:14:47.150376 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14fec6d4_6935_4283_944d_6a229b3cdc82.slice/crio-1fedff6513128f4e18d5d905c2efae8a85835f2ebe883aa39cc755159e4730ec WatchSource:0}: Error finding container 1fedff6513128f4e18d5d905c2efae8a85835f2ebe883aa39cc755159e4730ec: Status 404 returned error can't find the container with id 1fedff6513128f4e18d5d905c2efae8a85835f2ebe883aa39cc755159e4730ec Nov 25 09:14:47 crc kubenswrapper[4565]: I1125 09:14:47.856734 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" event={"ID":"145e5d59-fd78-4bc1-a97c-17ebf0d67fa4","Type":"ContainerStarted","Data":"eb1f0faae506db15faab3de0a8e8c1b24b8ea58571302317f6c6464f2010810b"} Nov 25 09:14:47 crc kubenswrapper[4565]: I1125 09:14:47.857753 4565 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7fb8cb44b7-5dvrd" event={"ID":"14fec6d4-6935-4283-944d-6a229b3cdc82","Type":"ContainerStarted","Data":"1fedff6513128f4e18d5d905c2efae8a85835f2ebe883aa39cc755159e4730ec"} Nov 25 09:14:49 crc kubenswrapper[4565]: I1125 09:14:49.866457 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" event={"ID":"145e5d59-fd78-4bc1-a97c-17ebf0d67fa4","Type":"ContainerStarted","Data":"7030d146b2f7fe1b4e54b3787a2f3e7159659f8c7e0f85a057e86ee80e9a6ff9"} Nov 25 09:14:49 crc kubenswrapper[4565]: I1125 09:14:49.866606 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" Nov 25 09:14:49 crc kubenswrapper[4565]: I1125 09:14:49.885425 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" podStartSLOduration=1.504872143 podStartE2EDuration="3.885410901s" podCreationTimestamp="2025-11-25 09:14:46 +0000 UTC" firstStartedPulling="2025-11-25 09:14:47.067903756 +0000 UTC m=+620.270398894" lastFinishedPulling="2025-11-25 09:14:49.448442514 +0000 UTC m=+622.650937652" observedRunningTime="2025-11-25 09:14:49.882837658 +0000 UTC m=+623.085332796" watchObservedRunningTime="2025-11-25 09:14:49.885410901 +0000 UTC m=+623.087906039" Nov 25 09:14:51 crc kubenswrapper[4565]: I1125 09:14:51.875201 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7fb8cb44b7-5dvrd" event={"ID":"14fec6d4-6935-4283-944d-6a229b3cdc82","Type":"ContainerStarted","Data":"55ae2d479f06962ea39ac486b0c0c74b3068e347bc537d59283178c674607527"} Nov 25 09:14:51 crc kubenswrapper[4565]: I1125 09:14:51.876315 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7fb8cb44b7-5dvrd" Nov 25 
09:14:51 crc kubenswrapper[4565]: I1125 09:14:51.888438 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7fb8cb44b7-5dvrd" podStartSLOduration=1.9507075980000002 podStartE2EDuration="5.888428359s" podCreationTimestamp="2025-11-25 09:14:46 +0000 UTC" firstStartedPulling="2025-11-25 09:14:47.152628949 +0000 UTC m=+620.355124088" lastFinishedPulling="2025-11-25 09:14:51.090349711 +0000 UTC m=+624.292844849" observedRunningTime="2025-11-25 09:14:51.887612031 +0000 UTC m=+625.090107169" watchObservedRunningTime="2025-11-25 09:14:51.888428359 +0000 UTC m=+625.090923498" Nov 25 09:15:00 crc kubenswrapper[4565]: I1125 09:15:00.118677 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401035-8dbdl"] Nov 25 09:15:00 crc kubenswrapper[4565]: I1125 09:15:00.119697 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401035-8dbdl" Nov 25 09:15:00 crc kubenswrapper[4565]: I1125 09:15:00.121229 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 25 09:15:00 crc kubenswrapper[4565]: I1125 09:15:00.122744 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 25 09:15:00 crc kubenswrapper[4565]: I1125 09:15:00.129912 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401035-8dbdl"] Nov 25 09:15:00 crc kubenswrapper[4565]: I1125 09:15:00.293044 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62c99be3-6d18-46c5-b8db-831beb1eb80d-secret-volume\") pod \"collect-profiles-29401035-8dbdl\" (UID: 
\"62c99be3-6d18-46c5-b8db-831beb1eb80d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401035-8dbdl" Nov 25 09:15:00 crc kubenswrapper[4565]: I1125 09:15:00.293136 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62c99be3-6d18-46c5-b8db-831beb1eb80d-config-volume\") pod \"collect-profiles-29401035-8dbdl\" (UID: \"62c99be3-6d18-46c5-b8db-831beb1eb80d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401035-8dbdl" Nov 25 09:15:00 crc kubenswrapper[4565]: I1125 09:15:00.293172 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlprg\" (UniqueName: \"kubernetes.io/projected/62c99be3-6d18-46c5-b8db-831beb1eb80d-kube-api-access-rlprg\") pod \"collect-profiles-29401035-8dbdl\" (UID: \"62c99be3-6d18-46c5-b8db-831beb1eb80d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401035-8dbdl" Nov 25 09:15:00 crc kubenswrapper[4565]: I1125 09:15:00.394669 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62c99be3-6d18-46c5-b8db-831beb1eb80d-secret-volume\") pod \"collect-profiles-29401035-8dbdl\" (UID: \"62c99be3-6d18-46c5-b8db-831beb1eb80d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401035-8dbdl" Nov 25 09:15:00 crc kubenswrapper[4565]: I1125 09:15:00.394733 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62c99be3-6d18-46c5-b8db-831beb1eb80d-config-volume\") pod \"collect-profiles-29401035-8dbdl\" (UID: \"62c99be3-6d18-46c5-b8db-831beb1eb80d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401035-8dbdl" Nov 25 09:15:00 crc kubenswrapper[4565]: I1125 09:15:00.394756 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rlprg\" (UniqueName: \"kubernetes.io/projected/62c99be3-6d18-46c5-b8db-831beb1eb80d-kube-api-access-rlprg\") pod \"collect-profiles-29401035-8dbdl\" (UID: \"62c99be3-6d18-46c5-b8db-831beb1eb80d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401035-8dbdl" Nov 25 09:15:00 crc kubenswrapper[4565]: I1125 09:15:00.395849 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62c99be3-6d18-46c5-b8db-831beb1eb80d-config-volume\") pod \"collect-profiles-29401035-8dbdl\" (UID: \"62c99be3-6d18-46c5-b8db-831beb1eb80d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401035-8dbdl" Nov 25 09:15:00 crc kubenswrapper[4565]: I1125 09:15:00.400552 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62c99be3-6d18-46c5-b8db-831beb1eb80d-secret-volume\") pod \"collect-profiles-29401035-8dbdl\" (UID: \"62c99be3-6d18-46c5-b8db-831beb1eb80d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401035-8dbdl" Nov 25 09:15:00 crc kubenswrapper[4565]: I1125 09:15:00.407510 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlprg\" (UniqueName: \"kubernetes.io/projected/62c99be3-6d18-46c5-b8db-831beb1eb80d-kube-api-access-rlprg\") pod \"collect-profiles-29401035-8dbdl\" (UID: \"62c99be3-6d18-46c5-b8db-831beb1eb80d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401035-8dbdl" Nov 25 09:15:00 crc kubenswrapper[4565]: I1125 09:15:00.445559 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401035-8dbdl" Nov 25 09:15:00 crc kubenswrapper[4565]: W1125 09:15:00.818122 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62c99be3_6d18_46c5_b8db_831beb1eb80d.slice/crio-88d4553dda6667accfc7dcf94b070be22da1a87ecf60be622af4a590aeedeee6 WatchSource:0}: Error finding container 88d4553dda6667accfc7dcf94b070be22da1a87ecf60be622af4a590aeedeee6: Status 404 returned error can't find the container with id 88d4553dda6667accfc7dcf94b070be22da1a87ecf60be622af4a590aeedeee6 Nov 25 09:15:00 crc kubenswrapper[4565]: I1125 09:15:00.820996 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401035-8dbdl"] Nov 25 09:15:00 crc kubenswrapper[4565]: I1125 09:15:00.914283 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401035-8dbdl" event={"ID":"62c99be3-6d18-46c5-b8db-831beb1eb80d","Type":"ContainerStarted","Data":"88d4553dda6667accfc7dcf94b070be22da1a87ecf60be622af4a590aeedeee6"} Nov 25 09:15:01 crc kubenswrapper[4565]: I1125 09:15:01.919035 4565 generic.go:334] "Generic (PLEG): container finished" podID="62c99be3-6d18-46c5-b8db-831beb1eb80d" containerID="58d9e06d96c969f0322a4ee8258fda4dc74a72519d33495ce748c1c634fe025d" exitCode=0 Nov 25 09:15:01 crc kubenswrapper[4565]: I1125 09:15:01.919131 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401035-8dbdl" event={"ID":"62c99be3-6d18-46c5-b8db-831beb1eb80d","Type":"ContainerDied","Data":"58d9e06d96c969f0322a4ee8258fda4dc74a72519d33495ce748c1c634fe025d"} Nov 25 09:15:03 crc kubenswrapper[4565]: I1125 09:15:03.098452 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401035-8dbdl" Nov 25 09:15:03 crc kubenswrapper[4565]: I1125 09:15:03.128227 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62c99be3-6d18-46c5-b8db-831beb1eb80d-secret-volume\") pod \"62c99be3-6d18-46c5-b8db-831beb1eb80d\" (UID: \"62c99be3-6d18-46c5-b8db-831beb1eb80d\") " Nov 25 09:15:03 crc kubenswrapper[4565]: I1125 09:15:03.138064 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c99be3-6d18-46c5-b8db-831beb1eb80d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "62c99be3-6d18-46c5-b8db-831beb1eb80d" (UID: "62c99be3-6d18-46c5-b8db-831beb1eb80d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:15:03 crc kubenswrapper[4565]: I1125 09:15:03.228881 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlprg\" (UniqueName: \"kubernetes.io/projected/62c99be3-6d18-46c5-b8db-831beb1eb80d-kube-api-access-rlprg\") pod \"62c99be3-6d18-46c5-b8db-831beb1eb80d\" (UID: \"62c99be3-6d18-46c5-b8db-831beb1eb80d\") " Nov 25 09:15:03 crc kubenswrapper[4565]: I1125 09:15:03.228996 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62c99be3-6d18-46c5-b8db-831beb1eb80d-config-volume\") pod \"62c99be3-6d18-46c5-b8db-831beb1eb80d\" (UID: \"62c99be3-6d18-46c5-b8db-831beb1eb80d\") " Nov 25 09:15:03 crc kubenswrapper[4565]: I1125 09:15:03.229224 4565 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62c99be3-6d18-46c5-b8db-831beb1eb80d-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 25 09:15:03 crc kubenswrapper[4565]: I1125 09:15:03.229546 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/62c99be3-6d18-46c5-b8db-831beb1eb80d-config-volume" (OuterVolumeSpecName: "config-volume") pod "62c99be3-6d18-46c5-b8db-831beb1eb80d" (UID: "62c99be3-6d18-46c5-b8db-831beb1eb80d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:15:03 crc kubenswrapper[4565]: I1125 09:15:03.231046 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62c99be3-6d18-46c5-b8db-831beb1eb80d-kube-api-access-rlprg" (OuterVolumeSpecName: "kube-api-access-rlprg") pod "62c99be3-6d18-46c5-b8db-831beb1eb80d" (UID: "62c99be3-6d18-46c5-b8db-831beb1eb80d"). InnerVolumeSpecName "kube-api-access-rlprg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:15:03 crc kubenswrapper[4565]: I1125 09:15:03.330044 4565 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62c99be3-6d18-46c5-b8db-831beb1eb80d-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 09:15:03 crc kubenswrapper[4565]: I1125 09:15:03.330066 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlprg\" (UniqueName: \"kubernetes.io/projected/62c99be3-6d18-46c5-b8db-831beb1eb80d-kube-api-access-rlprg\") on node \"crc\" DevicePath \"\"" Nov 25 09:15:03 crc kubenswrapper[4565]: I1125 09:15:03.928581 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401035-8dbdl" event={"ID":"62c99be3-6d18-46c5-b8db-831beb1eb80d","Type":"ContainerDied","Data":"88d4553dda6667accfc7dcf94b070be22da1a87ecf60be622af4a590aeedeee6"} Nov 25 09:15:03 crc kubenswrapper[4565]: I1125 09:15:03.928859 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88d4553dda6667accfc7dcf94b070be22da1a87ecf60be622af4a590aeedeee6" Nov 25 09:15:03 crc kubenswrapper[4565]: I1125 09:15:03.928660 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401035-8dbdl" Nov 25 09:15:06 crc kubenswrapper[4565]: I1125 09:15:06.967899 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7fb8cb44b7-5dvrd" Nov 25 09:15:26 crc kubenswrapper[4565]: I1125 09:15:26.662907 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.172817 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-82nts"] Nov 25 09:15:27 crc kubenswrapper[4565]: E1125 09:15:27.173214 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c99be3-6d18-46c5-b8db-831beb1eb80d" containerName="collect-profiles" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.173232 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c99be3-6d18-46c5-b8db-831beb1eb80d" containerName="collect-profiles" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.173357 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c99be3-6d18-46c5-b8db-831beb1eb80d" containerName="collect-profiles" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.174943 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.175717 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btdfz\" (UniqueName: \"kubernetes.io/projected/2a3347c5-5075-4d8d-99fb-cd2468efe83d-kube-api-access-btdfz\") pod \"frr-k8s-82nts\" (UID: \"2a3347c5-5075-4d8d-99fb-cd2468efe83d\") " pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.175776 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2a3347c5-5075-4d8d-99fb-cd2468efe83d-frr-startup\") pod \"frr-k8s-82nts\" (UID: \"2a3347c5-5075-4d8d-99fb-cd2468efe83d\") " pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.175799 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2a3347c5-5075-4d8d-99fb-cd2468efe83d-frr-conf\") pod \"frr-k8s-82nts\" (UID: \"2a3347c5-5075-4d8d-99fb-cd2468efe83d\") " pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.175847 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2a3347c5-5075-4d8d-99fb-cd2468efe83d-frr-sockets\") pod \"frr-k8s-82nts\" (UID: \"2a3347c5-5075-4d8d-99fb-cd2468efe83d\") " pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.175865 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2a3347c5-5075-4d8d-99fb-cd2468efe83d-reloader\") pod \"frr-k8s-82nts\" (UID: \"2a3347c5-5075-4d8d-99fb-cd2468efe83d\") " pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 
09:15:27.175902 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2a3347c5-5075-4d8d-99fb-cd2468efe83d-metrics\") pod \"frr-k8s-82nts\" (UID: \"2a3347c5-5075-4d8d-99fb-cd2468efe83d\") " pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.175915 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2a3347c5-5075-4d8d-99fb-cd2468efe83d-metrics-certs\") pod \"frr-k8s-82nts\" (UID: \"2a3347c5-5075-4d8d-99fb-cd2468efe83d\") " pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.179149 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-b722z"] Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.179621 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-b722z" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.180220 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.180439 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-kpqnd" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.180478 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.181297 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.190202 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-b722z"] Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 
09:15:27.253404 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-dr2xf"] Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.254193 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-dr2xf" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.256002 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.256008 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-fjg7f" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.256056 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.257421 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.270700 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6c7b4b5f48-gzvgq"] Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.271423 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-gzvgq" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.273291 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.277544 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2a3347c5-5075-4d8d-99fb-cd2468efe83d-frr-sockets\") pod \"frr-k8s-82nts\" (UID: \"2a3347c5-5075-4d8d-99fb-cd2468efe83d\") " pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.277573 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1d95b8b8-1675-48ae-b497-ff3fcf0bbc42-metallb-excludel2\") pod \"speaker-dr2xf\" (UID: \"1d95b8b8-1675-48ae-b497-ff3fcf0bbc42\") " pod="metallb-system/speaker-dr2xf" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.277591 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2a3347c5-5075-4d8d-99fb-cd2468efe83d-reloader\") pod \"frr-k8s-82nts\" (UID: \"2a3347c5-5075-4d8d-99fb-cd2468efe83d\") " pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.277607 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwl2c\" (UniqueName: \"kubernetes.io/projected/1d95b8b8-1675-48ae-b497-ff3fcf0bbc42-kube-api-access-jwl2c\") pod \"speaker-dr2xf\" (UID: \"1d95b8b8-1675-48ae-b497-ff3fcf0bbc42\") " pod="metallb-system/speaker-dr2xf" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.277633 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2a3347c5-5075-4d8d-99fb-cd2468efe83d-metrics\") pod 
\"frr-k8s-82nts\" (UID: \"2a3347c5-5075-4d8d-99fb-cd2468efe83d\") " pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.277650 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2a3347c5-5075-4d8d-99fb-cd2468efe83d-metrics-certs\") pod \"frr-k8s-82nts\" (UID: \"2a3347c5-5075-4d8d-99fb-cd2468efe83d\") " pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.277671 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nh2x\" (UniqueName: \"kubernetes.io/projected/23dd1cf1-30ed-4fc1-9b32-70897895e05d-kube-api-access-2nh2x\") pod \"frr-k8s-webhook-server-6998585d5-b722z\" (UID: \"23dd1cf1-30ed-4fc1-9b32-70897895e05d\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-b722z" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.277685 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1d95b8b8-1675-48ae-b497-ff3fcf0bbc42-memberlist\") pod \"speaker-dr2xf\" (UID: \"1d95b8b8-1675-48ae-b497-ff3fcf0bbc42\") " pod="metallb-system/speaker-dr2xf" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.277700 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/23dd1cf1-30ed-4fc1-9b32-70897895e05d-cert\") pod \"frr-k8s-webhook-server-6998585d5-b722z\" (UID: \"23dd1cf1-30ed-4fc1-9b32-70897895e05d\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-b722z" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.277720 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d23227d-3456-4a34-9aa7-4878c7ee4d37-cert\") pod \"controller-6c7b4b5f48-gzvgq\" (UID: 
\"2d23227d-3456-4a34-9aa7-4878c7ee4d37\") " pod="metallb-system/controller-6c7b4b5f48-gzvgq" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.277751 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btdfz\" (UniqueName: \"kubernetes.io/projected/2a3347c5-5075-4d8d-99fb-cd2468efe83d-kube-api-access-btdfz\") pod \"frr-k8s-82nts\" (UID: \"2a3347c5-5075-4d8d-99fb-cd2468efe83d\") " pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.277767 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2d23227d-3456-4a34-9aa7-4878c7ee4d37-metrics-certs\") pod \"controller-6c7b4b5f48-gzvgq\" (UID: \"2d23227d-3456-4a34-9aa7-4878c7ee4d37\") " pod="metallb-system/controller-6c7b4b5f48-gzvgq" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.277783 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2a3347c5-5075-4d8d-99fb-cd2468efe83d-frr-startup\") pod \"frr-k8s-82nts\" (UID: \"2a3347c5-5075-4d8d-99fb-cd2468efe83d\") " pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.277800 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2a3347c5-5075-4d8d-99fb-cd2468efe83d-frr-conf\") pod \"frr-k8s-82nts\" (UID: \"2a3347c5-5075-4d8d-99fb-cd2468efe83d\") " pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.277815 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d95b8b8-1675-48ae-b497-ff3fcf0bbc42-metrics-certs\") pod \"speaker-dr2xf\" (UID: \"1d95b8b8-1675-48ae-b497-ff3fcf0bbc42\") " pod="metallb-system/speaker-dr2xf" Nov 25 09:15:27 crc 
kubenswrapper[4565]: I1125 09:15:27.277836 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6nbt\" (UniqueName: \"kubernetes.io/projected/2d23227d-3456-4a34-9aa7-4878c7ee4d37-kube-api-access-h6nbt\") pod \"controller-6c7b4b5f48-gzvgq\" (UID: \"2d23227d-3456-4a34-9aa7-4878c7ee4d37\") " pod="metallb-system/controller-6c7b4b5f48-gzvgq" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.278173 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2a3347c5-5075-4d8d-99fb-cd2468efe83d-reloader\") pod \"frr-k8s-82nts\" (UID: \"2a3347c5-5075-4d8d-99fb-cd2468efe83d\") " pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.278208 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2a3347c5-5075-4d8d-99fb-cd2468efe83d-frr-sockets\") pod \"frr-k8s-82nts\" (UID: \"2a3347c5-5075-4d8d-99fb-cd2468efe83d\") " pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:27 crc kubenswrapper[4565]: E1125 09:15:27.278332 4565 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.278357 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2a3347c5-5075-4d8d-99fb-cd2468efe83d-metrics\") pod \"frr-k8s-82nts\" (UID: \"2a3347c5-5075-4d8d-99fb-cd2468efe83d\") " pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:27 crc kubenswrapper[4565]: E1125 09:15:27.278387 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a3347c5-5075-4d8d-99fb-cd2468efe83d-metrics-certs podName:2a3347c5-5075-4d8d-99fb-cd2468efe83d nodeName:}" failed. No retries permitted until 2025-11-25 09:15:27.778367926 +0000 UTC m=+660.980863085 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2a3347c5-5075-4d8d-99fb-cd2468efe83d-metrics-certs") pod "frr-k8s-82nts" (UID: "2a3347c5-5075-4d8d-99fb-cd2468efe83d") : secret "frr-k8s-certs-secret" not found Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.278522 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2a3347c5-5075-4d8d-99fb-cd2468efe83d-frr-conf\") pod \"frr-k8s-82nts\" (UID: \"2a3347c5-5075-4d8d-99fb-cd2468efe83d\") " pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.278945 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2a3347c5-5075-4d8d-99fb-cd2468efe83d-frr-startup\") pod \"frr-k8s-82nts\" (UID: \"2a3347c5-5075-4d8d-99fb-cd2468efe83d\") " pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.288148 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-gzvgq"] Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.294975 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btdfz\" (UniqueName: \"kubernetes.io/projected/2a3347c5-5075-4d8d-99fb-cd2468efe83d-kube-api-access-btdfz\") pod \"frr-k8s-82nts\" (UID: \"2a3347c5-5075-4d8d-99fb-cd2468efe83d\") " pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.381384 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nh2x\" (UniqueName: \"kubernetes.io/projected/23dd1cf1-30ed-4fc1-9b32-70897895e05d-kube-api-access-2nh2x\") pod \"frr-k8s-webhook-server-6998585d5-b722z\" (UID: \"23dd1cf1-30ed-4fc1-9b32-70897895e05d\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-b722z" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.381529 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1d95b8b8-1675-48ae-b497-ff3fcf0bbc42-memberlist\") pod \"speaker-dr2xf\" (UID: \"1d95b8b8-1675-48ae-b497-ff3fcf0bbc42\") " pod="metallb-system/speaker-dr2xf" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.381636 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/23dd1cf1-30ed-4fc1-9b32-70897895e05d-cert\") pod \"frr-k8s-webhook-server-6998585d5-b722z\" (UID: \"23dd1cf1-30ed-4fc1-9b32-70897895e05d\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-b722z" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.381740 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d23227d-3456-4a34-9aa7-4878c7ee4d37-cert\") pod \"controller-6c7b4b5f48-gzvgq\" (UID: \"2d23227d-3456-4a34-9aa7-4878c7ee4d37\") " pod="metallb-system/controller-6c7b4b5f48-gzvgq" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.381864 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2d23227d-3456-4a34-9aa7-4878c7ee4d37-metrics-certs\") pod \"controller-6c7b4b5f48-gzvgq\" (UID: \"2d23227d-3456-4a34-9aa7-4878c7ee4d37\") " pod="metallb-system/controller-6c7b4b5f48-gzvgq" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.382323 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d95b8b8-1675-48ae-b497-ff3fcf0bbc42-metrics-certs\") pod \"speaker-dr2xf\" (UID: \"1d95b8b8-1675-48ae-b497-ff3fcf0bbc42\") " pod="metallb-system/speaker-dr2xf" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.382808 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6nbt\" (UniqueName: 
\"kubernetes.io/projected/2d23227d-3456-4a34-9aa7-4878c7ee4d37-kube-api-access-h6nbt\") pod \"controller-6c7b4b5f48-gzvgq\" (UID: \"2d23227d-3456-4a34-9aa7-4878c7ee4d37\") " pod="metallb-system/controller-6c7b4b5f48-gzvgq" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.382914 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1d95b8b8-1675-48ae-b497-ff3fcf0bbc42-metallb-excludel2\") pod \"speaker-dr2xf\" (UID: \"1d95b8b8-1675-48ae-b497-ff3fcf0bbc42\") " pod="metallb-system/speaker-dr2xf" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.383066 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwl2c\" (UniqueName: \"kubernetes.io/projected/1d95b8b8-1675-48ae-b497-ff3fcf0bbc42-kube-api-access-jwl2c\") pod \"speaker-dr2xf\" (UID: \"1d95b8b8-1675-48ae-b497-ff3fcf0bbc42\") " pod="metallb-system/speaker-dr2xf" Nov 25 09:15:27 crc kubenswrapper[4565]: E1125 09:15:27.382057 4565 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 25 09:15:27 crc kubenswrapper[4565]: E1125 09:15:27.383436 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d95b8b8-1675-48ae-b497-ff3fcf0bbc42-memberlist podName:1d95b8b8-1675-48ae-b497-ff3fcf0bbc42 nodeName:}" failed. No retries permitted until 2025-11-25 09:15:27.883420255 +0000 UTC m=+661.085915393 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1d95b8b8-1675-48ae-b497-ff3fcf0bbc42-memberlist") pod "speaker-dr2xf" (UID: "1d95b8b8-1675-48ae-b497-ff3fcf0bbc42") : secret "metallb-memberlist" not found Nov 25 09:15:27 crc kubenswrapper[4565]: E1125 09:15:27.382187 4565 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Nov 25 09:15:27 crc kubenswrapper[4565]: E1125 09:15:27.383561 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d23227d-3456-4a34-9aa7-4878c7ee4d37-metrics-certs podName:2d23227d-3456-4a34-9aa7-4878c7ee4d37 nodeName:}" failed. No retries permitted until 2025-11-25 09:15:27.883552614 +0000 UTC m=+661.086047753 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2d23227d-3456-4a34-9aa7-4878c7ee4d37-metrics-certs") pod "controller-6c7b4b5f48-gzvgq" (UID: "2d23227d-3456-4a34-9aa7-4878c7ee4d37") : secret "controller-certs-secret" not found Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.384225 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1d95b8b8-1675-48ae-b497-ff3fcf0bbc42-metallb-excludel2\") pod \"speaker-dr2xf\" (UID: \"1d95b8b8-1675-48ae-b497-ff3fcf0bbc42\") " pod="metallb-system/speaker-dr2xf" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.384339 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.390420 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/23dd1cf1-30ed-4fc1-9b32-70897895e05d-cert\") pod \"frr-k8s-webhook-server-6998585d5-b722z\" (UID: \"23dd1cf1-30ed-4fc1-9b32-70897895e05d\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-b722z" 
Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.404069 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nh2x\" (UniqueName: \"kubernetes.io/projected/23dd1cf1-30ed-4fc1-9b32-70897895e05d-kube-api-access-2nh2x\") pod \"frr-k8s-webhook-server-6998585d5-b722z\" (UID: \"23dd1cf1-30ed-4fc1-9b32-70897895e05d\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-b722z" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.404540 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d95b8b8-1675-48ae-b497-ff3fcf0bbc42-metrics-certs\") pod \"speaker-dr2xf\" (UID: \"1d95b8b8-1675-48ae-b497-ff3fcf0bbc42\") " pod="metallb-system/speaker-dr2xf" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.405116 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d23227d-3456-4a34-9aa7-4878c7ee4d37-cert\") pod \"controller-6c7b4b5f48-gzvgq\" (UID: \"2d23227d-3456-4a34-9aa7-4878c7ee4d37\") " pod="metallb-system/controller-6c7b4b5f48-gzvgq" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.416342 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwl2c\" (UniqueName: \"kubernetes.io/projected/1d95b8b8-1675-48ae-b497-ff3fcf0bbc42-kube-api-access-jwl2c\") pod \"speaker-dr2xf\" (UID: \"1d95b8b8-1675-48ae-b497-ff3fcf0bbc42\") " pod="metallb-system/speaker-dr2xf" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.427436 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6nbt\" (UniqueName: \"kubernetes.io/projected/2d23227d-3456-4a34-9aa7-4878c7ee4d37-kube-api-access-h6nbt\") pod \"controller-6c7b4b5f48-gzvgq\" (UID: \"2d23227d-3456-4a34-9aa7-4878c7ee4d37\") " pod="metallb-system/controller-6c7b4b5f48-gzvgq" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.495635 4565 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-b722z" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.786279 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2a3347c5-5075-4d8d-99fb-cd2468efe83d-metrics-certs\") pod \"frr-k8s-82nts\" (UID: \"2a3347c5-5075-4d8d-99fb-cd2468efe83d\") " pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.790312 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2a3347c5-5075-4d8d-99fb-cd2468efe83d-metrics-certs\") pod \"frr-k8s-82nts\" (UID: \"2a3347c5-5075-4d8d-99fb-cd2468efe83d\") " pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.791136 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.841139 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-b722z"] Nov 25 09:15:27 crc kubenswrapper[4565]: W1125 09:15:27.843987 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23dd1cf1_30ed_4fc1_9b32_70897895e05d.slice/crio-a4562c63648c4d349493313b69303006a89c70d059f107f6608074842c59a274 WatchSource:0}: Error finding container a4562c63648c4d349493313b69303006a89c70d059f107f6608074842c59a274: Status 404 returned error can't find the container with id a4562c63648c4d349493313b69303006a89c70d059f107f6608074842c59a274 Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.886872 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1d95b8b8-1675-48ae-b497-ff3fcf0bbc42-memberlist\") pod \"speaker-dr2xf\" (UID: 
\"1d95b8b8-1675-48ae-b497-ff3fcf0bbc42\") " pod="metallb-system/speaker-dr2xf" Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.886950 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2d23227d-3456-4a34-9aa7-4878c7ee4d37-metrics-certs\") pod \"controller-6c7b4b5f48-gzvgq\" (UID: \"2d23227d-3456-4a34-9aa7-4878c7ee4d37\") " pod="metallb-system/controller-6c7b4b5f48-gzvgq" Nov 25 09:15:27 crc kubenswrapper[4565]: E1125 09:15:27.887188 4565 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 25 09:15:27 crc kubenswrapper[4565]: E1125 09:15:27.887320 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d95b8b8-1675-48ae-b497-ff3fcf0bbc42-memberlist podName:1d95b8b8-1675-48ae-b497-ff3fcf0bbc42 nodeName:}" failed. No retries permitted until 2025-11-25 09:15:28.887306559 +0000 UTC m=+662.089801697 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1d95b8b8-1675-48ae-b497-ff3fcf0bbc42-memberlist") pod "speaker-dr2xf" (UID: "1d95b8b8-1675-48ae-b497-ff3fcf0bbc42") : secret "metallb-memberlist" not found Nov 25 09:15:27 crc kubenswrapper[4565]: I1125 09:15:27.889428 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2d23227d-3456-4a34-9aa7-4878c7ee4d37-metrics-certs\") pod \"controller-6c7b4b5f48-gzvgq\" (UID: \"2d23227d-3456-4a34-9aa7-4878c7ee4d37\") " pod="metallb-system/controller-6c7b4b5f48-gzvgq" Nov 25 09:15:28 crc kubenswrapper[4565]: I1125 09:15:28.032625 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-b722z" event={"ID":"23dd1cf1-30ed-4fc1-9b32-70897895e05d","Type":"ContainerStarted","Data":"a4562c63648c4d349493313b69303006a89c70d059f107f6608074842c59a274"} Nov 25 09:15:28 crc kubenswrapper[4565]: I1125 09:15:28.033543 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-82nts" event={"ID":"2a3347c5-5075-4d8d-99fb-cd2468efe83d","Type":"ContainerStarted","Data":"81af8acf345f21bad91b6aa4993d6c226924b901a59cf78aa9e57f56ef233baa"} Nov 25 09:15:28 crc kubenswrapper[4565]: I1125 09:15:28.182035 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-gzvgq" Nov 25 09:15:28 crc kubenswrapper[4565]: I1125 09:15:28.516276 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-gzvgq"] Nov 25 09:15:28 crc kubenswrapper[4565]: W1125 09:15:28.521304 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d23227d_3456_4a34_9aa7_4878c7ee4d37.slice/crio-5abd6276d7b9c45b6150a043e1b64c26bb391e398d444a7b72144f475229b0b1 WatchSource:0}: Error finding container 5abd6276d7b9c45b6150a043e1b64c26bb391e398d444a7b72144f475229b0b1: Status 404 returned error can't find the container with id 5abd6276d7b9c45b6150a043e1b64c26bb391e398d444a7b72144f475229b0b1 Nov 25 09:15:28 crc kubenswrapper[4565]: I1125 09:15:28.895596 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1d95b8b8-1675-48ae-b497-ff3fcf0bbc42-memberlist\") pod \"speaker-dr2xf\" (UID: \"1d95b8b8-1675-48ae-b497-ff3fcf0bbc42\") " pod="metallb-system/speaker-dr2xf" Nov 25 09:15:28 crc kubenswrapper[4565]: I1125 09:15:28.899699 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1d95b8b8-1675-48ae-b497-ff3fcf0bbc42-memberlist\") pod \"speaker-dr2xf\" (UID: \"1d95b8b8-1675-48ae-b497-ff3fcf0bbc42\") " pod="metallb-system/speaker-dr2xf" Nov 25 09:15:29 crc kubenswrapper[4565]: I1125 09:15:29.040113 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-gzvgq" event={"ID":"2d23227d-3456-4a34-9aa7-4878c7ee4d37","Type":"ContainerStarted","Data":"f58bbebbb5a799db180e0c01d59519f8a5a98dfa6a9081077d4adb40010adbda"} Nov 25 09:15:29 crc kubenswrapper[4565]: I1125 09:15:29.040157 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-gzvgq" 
event={"ID":"2d23227d-3456-4a34-9aa7-4878c7ee4d37","Type":"ContainerStarted","Data":"591fb8933f3f63b172ed39cbac192b2aaccf39aec3ea9185f3a55a7bf37e3eb7"} Nov 25 09:15:29 crc kubenswrapper[4565]: I1125 09:15:29.040169 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-gzvgq" event={"ID":"2d23227d-3456-4a34-9aa7-4878c7ee4d37","Type":"ContainerStarted","Data":"5abd6276d7b9c45b6150a043e1b64c26bb391e398d444a7b72144f475229b0b1"} Nov 25 09:15:29 crc kubenswrapper[4565]: I1125 09:15:29.040330 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6c7b4b5f48-gzvgq" Nov 25 09:15:29 crc kubenswrapper[4565]: I1125 09:15:29.059967 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6c7b4b5f48-gzvgq" podStartSLOduration=2.059957115 podStartE2EDuration="2.059957115s" podCreationTimestamp="2025-11-25 09:15:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:15:29.056084564 +0000 UTC m=+662.258579702" watchObservedRunningTime="2025-11-25 09:15:29.059957115 +0000 UTC m=+662.262452253" Nov 25 09:15:29 crc kubenswrapper[4565]: I1125 09:15:29.065480 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-dr2xf" Nov 25 09:15:29 crc kubenswrapper[4565]: W1125 09:15:29.086433 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d95b8b8_1675_48ae_b497_ff3fcf0bbc42.slice/crio-d18cd42584db9cd20ec803f027a91901d097cba014e926036be8374bd944fcf3 WatchSource:0}: Error finding container d18cd42584db9cd20ec803f027a91901d097cba014e926036be8374bd944fcf3: Status 404 returned error can't find the container with id d18cd42584db9cd20ec803f027a91901d097cba014e926036be8374bd944fcf3 Nov 25 09:15:30 crc kubenswrapper[4565]: I1125 09:15:30.078359 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dr2xf" event={"ID":"1d95b8b8-1675-48ae-b497-ff3fcf0bbc42","Type":"ContainerStarted","Data":"5953cd6220bae68afbcce3776c1d95c05426992a7f97b370c97e8de942c3e9af"} Nov 25 09:15:30 crc kubenswrapper[4565]: I1125 09:15:30.079006 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dr2xf" event={"ID":"1d95b8b8-1675-48ae-b497-ff3fcf0bbc42","Type":"ContainerStarted","Data":"2c080629527271f398fd307569b223429a7056f7a48724435c13097cffa39d5f"} Nov 25 09:15:30 crc kubenswrapper[4565]: I1125 09:15:30.079026 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dr2xf" event={"ID":"1d95b8b8-1675-48ae-b497-ff3fcf0bbc42","Type":"ContainerStarted","Data":"d18cd42584db9cd20ec803f027a91901d097cba014e926036be8374bd944fcf3"} Nov 25 09:15:30 crc kubenswrapper[4565]: I1125 09:15:30.079681 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-dr2xf" Nov 25 09:15:30 crc kubenswrapper[4565]: I1125 09:15:30.112069 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-dr2xf" podStartSLOduration=3.112055681 podStartE2EDuration="3.112055681s" podCreationTimestamp="2025-11-25 09:15:27 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:15:30.109401557 +0000 UTC m=+663.311896695" watchObservedRunningTime="2025-11-25 09:15:30.112055681 +0000 UTC m=+663.314550819" Nov 25 09:15:35 crc kubenswrapper[4565]: I1125 09:15:35.111614 4565 generic.go:334] "Generic (PLEG): container finished" podID="2a3347c5-5075-4d8d-99fb-cd2468efe83d" containerID="96653c464570b93490c5f3abfbcda8a9210357523cbbba452a084ab36078970a" exitCode=0 Nov 25 09:15:35 crc kubenswrapper[4565]: I1125 09:15:35.111733 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-82nts" event={"ID":"2a3347c5-5075-4d8d-99fb-cd2468efe83d","Type":"ContainerDied","Data":"96653c464570b93490c5f3abfbcda8a9210357523cbbba452a084ab36078970a"} Nov 25 09:15:35 crc kubenswrapper[4565]: I1125 09:15:35.114973 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-b722z" event={"ID":"23dd1cf1-30ed-4fc1-9b32-70897895e05d","Type":"ContainerStarted","Data":"8b6816cefe49419488d0cfc3fe8c51c09d552b6a3f556e355665441197a8d37f"} Nov 25 09:15:35 crc kubenswrapper[4565]: I1125 09:15:35.115045 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-6998585d5-b722z" Nov 25 09:15:35 crc kubenswrapper[4565]: I1125 09:15:35.147522 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-6998585d5-b722z" podStartSLOduration=1.274136542 podStartE2EDuration="8.147507423s" podCreationTimestamp="2025-11-25 09:15:27 +0000 UTC" firstStartedPulling="2025-11-25 09:15:27.846167748 +0000 UTC m=+661.048662886" lastFinishedPulling="2025-11-25 09:15:34.719538629 +0000 UTC m=+667.922033767" observedRunningTime="2025-11-25 09:15:35.142814184 +0000 UTC m=+668.345309323" watchObservedRunningTime="2025-11-25 09:15:35.147507423 +0000 UTC m=+668.350002561" Nov 25 09:15:36 
crc kubenswrapper[4565]: I1125 09:15:36.126201 4565 generic.go:334] "Generic (PLEG): container finished" podID="2a3347c5-5075-4d8d-99fb-cd2468efe83d" containerID="bb3ee9875656608494da3e50a2f04766e1c6949f4e9a83a9dc918c7552eb8def" exitCode=0 Nov 25 09:15:36 crc kubenswrapper[4565]: I1125 09:15:36.126307 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-82nts" event={"ID":"2a3347c5-5075-4d8d-99fb-cd2468efe83d","Type":"ContainerDied","Data":"bb3ee9875656608494da3e50a2f04766e1c6949f4e9a83a9dc918c7552eb8def"} Nov 25 09:15:37 crc kubenswrapper[4565]: I1125 09:15:37.133211 4565 generic.go:334] "Generic (PLEG): container finished" podID="2a3347c5-5075-4d8d-99fb-cd2468efe83d" containerID="f7208c271a605c4ee10be6a8aa3a76232f5381de32fc34ec03a98508dba50e9d" exitCode=0 Nov 25 09:15:37 crc kubenswrapper[4565]: I1125 09:15:37.133275 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-82nts" event={"ID":"2a3347c5-5075-4d8d-99fb-cd2468efe83d","Type":"ContainerDied","Data":"f7208c271a605c4ee10be6a8aa3a76232f5381de32fc34ec03a98508dba50e9d"} Nov 25 09:15:38 crc kubenswrapper[4565]: I1125 09:15:38.144843 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-82nts" event={"ID":"2a3347c5-5075-4d8d-99fb-cd2468efe83d","Type":"ContainerStarted","Data":"d19d3f6125319b2504da5c38722c7c8f97d78ae585d7475dde092d66ef32d15c"} Nov 25 09:15:38 crc kubenswrapper[4565]: I1125 09:15:38.146345 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:38 crc kubenswrapper[4565]: I1125 09:15:38.146443 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-82nts" event={"ID":"2a3347c5-5075-4d8d-99fb-cd2468efe83d","Type":"ContainerStarted","Data":"220f9c0f654232b7e17af88cc87212cd53d796335d716a20e3cb1139bed70cae"} Nov 25 09:15:38 crc kubenswrapper[4565]: I1125 09:15:38.146502 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-82nts" event={"ID":"2a3347c5-5075-4d8d-99fb-cd2468efe83d","Type":"ContainerStarted","Data":"9b98fc69f8205e48785e2a7a4a785f7987b40815f3992ee003b5b77d348af2a1"} Nov 25 09:15:38 crc kubenswrapper[4565]: I1125 09:15:38.146564 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-82nts" event={"ID":"2a3347c5-5075-4d8d-99fb-cd2468efe83d","Type":"ContainerStarted","Data":"a3c0d6093ac5d0256f1e4007cbcbe6c8b199c82b980012ddfcbcd9a18420a31f"} Nov 25 09:15:38 crc kubenswrapper[4565]: I1125 09:15:38.146626 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-82nts" event={"ID":"2a3347c5-5075-4d8d-99fb-cd2468efe83d","Type":"ContainerStarted","Data":"15988b5a01fd8bfc1c88f41dc47172590ed34123e52b9ed118f46af4d60d4af6"} Nov 25 09:15:38 crc kubenswrapper[4565]: I1125 09:15:38.146676 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-82nts" event={"ID":"2a3347c5-5075-4d8d-99fb-cd2468efe83d","Type":"ContainerStarted","Data":"eb18cd502585e466a2f6672a3c6dc7a21dc6b92d20027b7a54071bfc014b923d"} Nov 25 09:15:38 crc kubenswrapper[4565]: I1125 09:15:38.173115 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-82nts" podStartSLOduration=4.330833592 podStartE2EDuration="11.173099019s" podCreationTimestamp="2025-11-25 09:15:27 +0000 UTC" firstStartedPulling="2025-11-25 09:15:27.868904534 +0000 UTC m=+661.071399672" lastFinishedPulling="2025-11-25 09:15:34.711169961 +0000 UTC m=+667.913665099" observedRunningTime="2025-11-25 09:15:38.171454489 +0000 UTC m=+671.373949628" watchObservedRunningTime="2025-11-25 09:15:38.173099019 +0000 UTC m=+671.375594157" Nov 25 09:15:38 crc kubenswrapper[4565]: I1125 09:15:38.215224 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6c7b4b5f48-gzvgq" Nov 25 09:15:39 crc kubenswrapper[4565]: I1125 09:15:39.068707 4565 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="metallb-system/speaker-dr2xf" Nov 25 09:15:40 crc kubenswrapper[4565]: I1125 09:15:40.986731 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-2lsnl"] Nov 25 09:15:40 crc kubenswrapper[4565]: I1125 09:15:40.987621 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-2lsnl" Nov 25 09:15:40 crc kubenswrapper[4565]: I1125 09:15:40.992258 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 25 09:15:40 crc kubenswrapper[4565]: I1125 09:15:40.992468 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 25 09:15:40 crc kubenswrapper[4565]: I1125 09:15:40.992635 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-s6c8r" Nov 25 09:15:40 crc kubenswrapper[4565]: I1125 09:15:40.995660 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2lsnl"] Nov 25 09:15:41 crc kubenswrapper[4565]: I1125 09:15:41.071739 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvvfd\" (UniqueName: \"kubernetes.io/projected/4dd52538-303f-4419-b8d7-69b342312d2a-kube-api-access-gvvfd\") pod \"openstack-operator-index-2lsnl\" (UID: \"4dd52538-303f-4419-b8d7-69b342312d2a\") " pod="openstack-operators/openstack-operator-index-2lsnl" Nov 25 09:15:41 crc kubenswrapper[4565]: I1125 09:15:41.173266 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvvfd\" (UniqueName: \"kubernetes.io/projected/4dd52538-303f-4419-b8d7-69b342312d2a-kube-api-access-gvvfd\") pod \"openstack-operator-index-2lsnl\" (UID: \"4dd52538-303f-4419-b8d7-69b342312d2a\") " 
pod="openstack-operators/openstack-operator-index-2lsnl" Nov 25 09:15:41 crc kubenswrapper[4565]: I1125 09:15:41.195551 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvvfd\" (UniqueName: \"kubernetes.io/projected/4dd52538-303f-4419-b8d7-69b342312d2a-kube-api-access-gvvfd\") pod \"openstack-operator-index-2lsnl\" (UID: \"4dd52538-303f-4419-b8d7-69b342312d2a\") " pod="openstack-operators/openstack-operator-index-2lsnl" Nov 25 09:15:41 crc kubenswrapper[4565]: I1125 09:15:41.301104 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-2lsnl" Nov 25 09:15:41 crc kubenswrapper[4565]: I1125 09:15:41.653597 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2lsnl"] Nov 25 09:15:42 crc kubenswrapper[4565]: I1125 09:15:42.166901 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2lsnl" event={"ID":"4dd52538-303f-4419-b8d7-69b342312d2a","Type":"ContainerStarted","Data":"5cc49b7ac698e24c400961e6665182b9c7ee71887e4ef39e2d5a26fb6edb9e1d"} Nov 25 09:15:42 crc kubenswrapper[4565]: I1125 09:15:42.791366 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:42 crc kubenswrapper[4565]: I1125 09:15:42.847033 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:43 crc kubenswrapper[4565]: I1125 09:15:43.175527 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2lsnl" event={"ID":"4dd52538-303f-4419-b8d7-69b342312d2a","Type":"ContainerStarted","Data":"a3c3ab8259b14a3d24aa7b2f101874f0e2a1e4befd8d3090d80ff57791c24613"} Nov 25 09:15:43 crc kubenswrapper[4565]: I1125 09:15:43.197093 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-index-2lsnl" podStartSLOduration=2.167849564 podStartE2EDuration="3.197075016s" podCreationTimestamp="2025-11-25 09:15:40 +0000 UTC" firstStartedPulling="2025-11-25 09:15:41.654246732 +0000 UTC m=+674.856741869" lastFinishedPulling="2025-11-25 09:15:42.683472183 +0000 UTC m=+675.885967321" observedRunningTime="2025-11-25 09:15:43.193790824 +0000 UTC m=+676.396285962" watchObservedRunningTime="2025-11-25 09:15:43.197075016 +0000 UTC m=+676.399570154" Nov 25 09:15:44 crc kubenswrapper[4565]: I1125 09:15:44.767745 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-2lsnl"] Nov 25 09:15:45 crc kubenswrapper[4565]: I1125 09:15:45.186585 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-2lsnl" podUID="4dd52538-303f-4419-b8d7-69b342312d2a" containerName="registry-server" containerID="cri-o://a3c3ab8259b14a3d24aa7b2f101874f0e2a1e4befd8d3090d80ff57791c24613" gracePeriod=2 Nov 25 09:15:45 crc kubenswrapper[4565]: I1125 09:15:45.374349 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-2hp6w"] Nov 25 09:15:45 crc kubenswrapper[4565]: I1125 09:15:45.375207 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-2hp6w" Nov 25 09:15:45 crc kubenswrapper[4565]: I1125 09:15:45.381602 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2hp6w"] Nov 25 09:15:45 crc kubenswrapper[4565]: I1125 09:15:45.506787 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-2lsnl" Nov 25 09:15:45 crc kubenswrapper[4565]: I1125 09:15:45.523686 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdcwz\" (UniqueName: \"kubernetes.io/projected/4b5856eb-4d4d-406d-bb20-cbc44a10e522-kube-api-access-qdcwz\") pod \"openstack-operator-index-2hp6w\" (UID: \"4b5856eb-4d4d-406d-bb20-cbc44a10e522\") " pod="openstack-operators/openstack-operator-index-2hp6w" Nov 25 09:15:45 crc kubenswrapper[4565]: I1125 09:15:45.626674 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvvfd\" (UniqueName: \"kubernetes.io/projected/4dd52538-303f-4419-b8d7-69b342312d2a-kube-api-access-gvvfd\") pod \"4dd52538-303f-4419-b8d7-69b342312d2a\" (UID: \"4dd52538-303f-4419-b8d7-69b342312d2a\") " Nov 25 09:15:45 crc kubenswrapper[4565]: I1125 09:15:45.627290 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdcwz\" (UniqueName: \"kubernetes.io/projected/4b5856eb-4d4d-406d-bb20-cbc44a10e522-kube-api-access-qdcwz\") pod \"openstack-operator-index-2hp6w\" (UID: \"4b5856eb-4d4d-406d-bb20-cbc44a10e522\") " pod="openstack-operators/openstack-operator-index-2hp6w" Nov 25 09:15:45 crc kubenswrapper[4565]: I1125 09:15:45.634447 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dd52538-303f-4419-b8d7-69b342312d2a-kube-api-access-gvvfd" (OuterVolumeSpecName: "kube-api-access-gvvfd") pod "4dd52538-303f-4419-b8d7-69b342312d2a" (UID: "4dd52538-303f-4419-b8d7-69b342312d2a"). InnerVolumeSpecName "kube-api-access-gvvfd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:15:45 crc kubenswrapper[4565]: I1125 09:15:45.641369 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdcwz\" (UniqueName: \"kubernetes.io/projected/4b5856eb-4d4d-406d-bb20-cbc44a10e522-kube-api-access-qdcwz\") pod \"openstack-operator-index-2hp6w\" (UID: \"4b5856eb-4d4d-406d-bb20-cbc44a10e522\") " pod="openstack-operators/openstack-operator-index-2hp6w" Nov 25 09:15:45 crc kubenswrapper[4565]: I1125 09:15:45.695896 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-2hp6w" Nov 25 09:15:45 crc kubenswrapper[4565]: I1125 09:15:45.730202 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvvfd\" (UniqueName: \"kubernetes.io/projected/4dd52538-303f-4419-b8d7-69b342312d2a-kube-api-access-gvvfd\") on node \"crc\" DevicePath \"\"" Nov 25 09:15:46 crc kubenswrapper[4565]: I1125 09:15:46.050563 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2hp6w"] Nov 25 09:15:46 crc kubenswrapper[4565]: W1125 09:15:46.054120 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b5856eb_4d4d_406d_bb20_cbc44a10e522.slice/crio-b0a680298aec611b67098fba862eff9d5378ed861212be8f9399ced23a8096f3 WatchSource:0}: Error finding container b0a680298aec611b67098fba862eff9d5378ed861212be8f9399ced23a8096f3: Status 404 returned error can't find the container with id b0a680298aec611b67098fba862eff9d5378ed861212be8f9399ced23a8096f3 Nov 25 09:15:46 crc kubenswrapper[4565]: I1125 09:15:46.192762 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2hp6w" event={"ID":"4b5856eb-4d4d-406d-bb20-cbc44a10e522","Type":"ContainerStarted","Data":"b0a680298aec611b67098fba862eff9d5378ed861212be8f9399ced23a8096f3"} Nov 25 
09:15:46 crc kubenswrapper[4565]: I1125 09:15:46.194130 4565 generic.go:334] "Generic (PLEG): container finished" podID="4dd52538-303f-4419-b8d7-69b342312d2a" containerID="a3c3ab8259b14a3d24aa7b2f101874f0e2a1e4befd8d3090d80ff57791c24613" exitCode=0 Nov 25 09:15:46 crc kubenswrapper[4565]: I1125 09:15:46.194182 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2lsnl" event={"ID":"4dd52538-303f-4419-b8d7-69b342312d2a","Type":"ContainerDied","Data":"a3c3ab8259b14a3d24aa7b2f101874f0e2a1e4befd8d3090d80ff57791c24613"} Nov 25 09:15:46 crc kubenswrapper[4565]: I1125 09:15:46.194202 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2lsnl" event={"ID":"4dd52538-303f-4419-b8d7-69b342312d2a","Type":"ContainerDied","Data":"5cc49b7ac698e24c400961e6665182b9c7ee71887e4ef39e2d5a26fb6edb9e1d"} Nov 25 09:15:46 crc kubenswrapper[4565]: I1125 09:15:46.194230 4565 scope.go:117] "RemoveContainer" containerID="a3c3ab8259b14a3d24aa7b2f101874f0e2a1e4befd8d3090d80ff57791c24613" Nov 25 09:15:46 crc kubenswrapper[4565]: I1125 09:15:46.194353 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-2lsnl" Nov 25 09:15:46 crc kubenswrapper[4565]: I1125 09:15:46.209789 4565 scope.go:117] "RemoveContainer" containerID="a3c3ab8259b14a3d24aa7b2f101874f0e2a1e4befd8d3090d80ff57791c24613" Nov 25 09:15:46 crc kubenswrapper[4565]: E1125 09:15:46.210242 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3c3ab8259b14a3d24aa7b2f101874f0e2a1e4befd8d3090d80ff57791c24613\": container with ID starting with a3c3ab8259b14a3d24aa7b2f101874f0e2a1e4befd8d3090d80ff57791c24613 not found: ID does not exist" containerID="a3c3ab8259b14a3d24aa7b2f101874f0e2a1e4befd8d3090d80ff57791c24613" Nov 25 09:15:46 crc kubenswrapper[4565]: I1125 09:15:46.210300 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3c3ab8259b14a3d24aa7b2f101874f0e2a1e4befd8d3090d80ff57791c24613"} err="failed to get container status \"a3c3ab8259b14a3d24aa7b2f101874f0e2a1e4befd8d3090d80ff57791c24613\": rpc error: code = NotFound desc = could not find container \"a3c3ab8259b14a3d24aa7b2f101874f0e2a1e4befd8d3090d80ff57791c24613\": container with ID starting with a3c3ab8259b14a3d24aa7b2f101874f0e2a1e4befd8d3090d80ff57791c24613 not found: ID does not exist" Nov 25 09:15:46 crc kubenswrapper[4565]: I1125 09:15:46.219570 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-2lsnl"] Nov 25 09:15:46 crc kubenswrapper[4565]: I1125 09:15:46.229865 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-2lsnl"] Nov 25 09:15:47 crc kubenswrapper[4565]: I1125 09:15:47.105901 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dd52538-303f-4419-b8d7-69b342312d2a" path="/var/lib/kubelet/pods/4dd52538-303f-4419-b8d7-69b342312d2a/volumes" Nov 25 09:15:47 crc kubenswrapper[4565]: I1125 09:15:47.204291 4565 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2hp6w" event={"ID":"4b5856eb-4d4d-406d-bb20-cbc44a10e522","Type":"ContainerStarted","Data":"3d09cf9a5ab738d5964be62208fe27de82720a2058167562c9cd1ea8ecbdfd0d"} Nov 25 09:15:47 crc kubenswrapper[4565]: I1125 09:15:47.219057 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-2hp6w" podStartSLOduration=1.6389058319999998 podStartE2EDuration="2.219042333s" podCreationTimestamp="2025-11-25 09:15:45 +0000 UTC" firstStartedPulling="2025-11-25 09:15:46.057386384 +0000 UTC m=+679.259881522" lastFinishedPulling="2025-11-25 09:15:46.637522885 +0000 UTC m=+679.840018023" observedRunningTime="2025-11-25 09:15:47.218147535 +0000 UTC m=+680.420642663" watchObservedRunningTime="2025-11-25 09:15:47.219042333 +0000 UTC m=+680.421537461" Nov 25 09:15:47 crc kubenswrapper[4565]: I1125 09:15:47.500624 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-6998585d5-b722z" Nov 25 09:15:47 crc kubenswrapper[4565]: I1125 09:15:47.794597 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-82nts" Nov 25 09:15:55 crc kubenswrapper[4565]: I1125 09:15:55.100362 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:15:55 crc kubenswrapper[4565]: I1125 09:15:55.100854 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Nov 25 09:15:55 crc kubenswrapper[4565]: I1125 09:15:55.696210 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-2hp6w" Nov 25 09:15:55 crc kubenswrapper[4565]: I1125 09:15:55.696602 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-2hp6w" Nov 25 09:15:55 crc kubenswrapper[4565]: I1125 09:15:55.727672 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-2hp6w" Nov 25 09:15:56 crc kubenswrapper[4565]: I1125 09:15:56.273483 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-2hp6w" Nov 25 09:16:02 crc kubenswrapper[4565]: I1125 09:16:02.199109 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj"] Nov 25 09:16:02 crc kubenswrapper[4565]: E1125 09:16:02.199859 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd52538-303f-4419-b8d7-69b342312d2a" containerName="registry-server" Nov 25 09:16:02 crc kubenswrapper[4565]: I1125 09:16:02.199872 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd52538-303f-4419-b8d7-69b342312d2a" containerName="registry-server" Nov 25 09:16:02 crc kubenswrapper[4565]: I1125 09:16:02.200001 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dd52538-303f-4419-b8d7-69b342312d2a" containerName="registry-server" Nov 25 09:16:02 crc kubenswrapper[4565]: I1125 09:16:02.200732 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj" Nov 25 09:16:02 crc kubenswrapper[4565]: I1125 09:16:02.202274 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-ngf2r" Nov 25 09:16:02 crc kubenswrapper[4565]: I1125 09:16:02.208308 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj"] Nov 25 09:16:02 crc kubenswrapper[4565]: I1125 09:16:02.316333 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62937ebb-ead0-4d96-b186-9dfcc8967ec0-bundle\") pod \"bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj\" (UID: \"62937ebb-ead0-4d96-b186-9dfcc8967ec0\") " pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj" Nov 25 09:16:02 crc kubenswrapper[4565]: I1125 09:16:02.316383 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc86s\" (UniqueName: \"kubernetes.io/projected/62937ebb-ead0-4d96-b186-9dfcc8967ec0-kube-api-access-cc86s\") pod \"bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj\" (UID: \"62937ebb-ead0-4d96-b186-9dfcc8967ec0\") " pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj" Nov 25 09:16:02 crc kubenswrapper[4565]: I1125 09:16:02.316476 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62937ebb-ead0-4d96-b186-9dfcc8967ec0-util\") pod \"bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj\" (UID: \"62937ebb-ead0-4d96-b186-9dfcc8967ec0\") " pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj" Nov 25 09:16:02 crc kubenswrapper[4565]: I1125 
09:16:02.417425 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62937ebb-ead0-4d96-b186-9dfcc8967ec0-util\") pod \"bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj\" (UID: \"62937ebb-ead0-4d96-b186-9dfcc8967ec0\") " pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj" Nov 25 09:16:02 crc kubenswrapper[4565]: I1125 09:16:02.417653 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62937ebb-ead0-4d96-b186-9dfcc8967ec0-bundle\") pod \"bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj\" (UID: \"62937ebb-ead0-4d96-b186-9dfcc8967ec0\") " pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj" Nov 25 09:16:02 crc kubenswrapper[4565]: I1125 09:16:02.417736 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc86s\" (UniqueName: \"kubernetes.io/projected/62937ebb-ead0-4d96-b186-9dfcc8967ec0-kube-api-access-cc86s\") pod \"bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj\" (UID: \"62937ebb-ead0-4d96-b186-9dfcc8967ec0\") " pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj" Nov 25 09:16:02 crc kubenswrapper[4565]: I1125 09:16:02.417830 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62937ebb-ead0-4d96-b186-9dfcc8967ec0-util\") pod \"bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj\" (UID: \"62937ebb-ead0-4d96-b186-9dfcc8967ec0\") " pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj" Nov 25 09:16:02 crc kubenswrapper[4565]: I1125 09:16:02.418631 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/62937ebb-ead0-4d96-b186-9dfcc8967ec0-bundle\") pod \"bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj\" (UID: \"62937ebb-ead0-4d96-b186-9dfcc8967ec0\") " pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj" Nov 25 09:16:02 crc kubenswrapper[4565]: I1125 09:16:02.432470 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc86s\" (UniqueName: \"kubernetes.io/projected/62937ebb-ead0-4d96-b186-9dfcc8967ec0-kube-api-access-cc86s\") pod \"bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj\" (UID: \"62937ebb-ead0-4d96-b186-9dfcc8967ec0\") " pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj" Nov 25 09:16:02 crc kubenswrapper[4565]: I1125 09:16:02.512420 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj" Nov 25 09:16:02 crc kubenswrapper[4565]: I1125 09:16:02.855179 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj"] Nov 25 09:16:03 crc kubenswrapper[4565]: I1125 09:16:03.281776 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj" event={"ID":"62937ebb-ead0-4d96-b186-9dfcc8967ec0","Type":"ContainerDied","Data":"17fccf04bc590b87a810b7eb30b6ad1810771c15303da216ef30a9a516489be7"} Nov 25 09:16:03 crc kubenswrapper[4565]: I1125 09:16:03.282167 4565 generic.go:334] "Generic (PLEG): container finished" podID="62937ebb-ead0-4d96-b186-9dfcc8967ec0" containerID="17fccf04bc590b87a810b7eb30b6ad1810771c15303da216ef30a9a516489be7" exitCode=0 Nov 25 09:16:03 crc kubenswrapper[4565]: I1125 09:16:03.282196 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj" event={"ID":"62937ebb-ead0-4d96-b186-9dfcc8967ec0","Type":"ContainerStarted","Data":"f0a5d16dd2d582f00f0847f6accd7740746f603787b1106043931f05de340710"} Nov 25 09:16:04 crc kubenswrapper[4565]: I1125 09:16:04.289437 4565 generic.go:334] "Generic (PLEG): container finished" podID="62937ebb-ead0-4d96-b186-9dfcc8967ec0" containerID="ccc268ebd45a8f895f15c26f009ee1fef555b8b58e529bd6d9b2a65b6599faf6" exitCode=0 Nov 25 09:16:04 crc kubenswrapper[4565]: I1125 09:16:04.289490 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj" event={"ID":"62937ebb-ead0-4d96-b186-9dfcc8967ec0","Type":"ContainerDied","Data":"ccc268ebd45a8f895f15c26f009ee1fef555b8b58e529bd6d9b2a65b6599faf6"} Nov 25 09:16:05 crc kubenswrapper[4565]: I1125 09:16:05.295873 4565 generic.go:334] "Generic (PLEG): container finished" podID="62937ebb-ead0-4d96-b186-9dfcc8967ec0" containerID="1f57455349490d09b3ed83fc27ac2e5515ed2849e488918becfa4314dd0b7eaa" exitCode=0 Nov 25 09:16:05 crc kubenswrapper[4565]: I1125 09:16:05.295967 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj" event={"ID":"62937ebb-ead0-4d96-b186-9dfcc8967ec0","Type":"ContainerDied","Data":"1f57455349490d09b3ed83fc27ac2e5515ed2849e488918becfa4314dd0b7eaa"} Nov 25 09:16:06 crc kubenswrapper[4565]: I1125 09:16:06.476291 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj" Nov 25 09:16:06 crc kubenswrapper[4565]: I1125 09:16:06.573751 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62937ebb-ead0-4d96-b186-9dfcc8967ec0-bundle\") pod \"62937ebb-ead0-4d96-b186-9dfcc8967ec0\" (UID: \"62937ebb-ead0-4d96-b186-9dfcc8967ec0\") " Nov 25 09:16:06 crc kubenswrapper[4565]: I1125 09:16:06.573843 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc86s\" (UniqueName: \"kubernetes.io/projected/62937ebb-ead0-4d96-b186-9dfcc8967ec0-kube-api-access-cc86s\") pod \"62937ebb-ead0-4d96-b186-9dfcc8967ec0\" (UID: \"62937ebb-ead0-4d96-b186-9dfcc8967ec0\") " Nov 25 09:16:06 crc kubenswrapper[4565]: I1125 09:16:06.573868 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62937ebb-ead0-4d96-b186-9dfcc8967ec0-util\") pod \"62937ebb-ead0-4d96-b186-9dfcc8967ec0\" (UID: \"62937ebb-ead0-4d96-b186-9dfcc8967ec0\") " Nov 25 09:16:06 crc kubenswrapper[4565]: I1125 09:16:06.574407 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62937ebb-ead0-4d96-b186-9dfcc8967ec0-bundle" (OuterVolumeSpecName: "bundle") pod "62937ebb-ead0-4d96-b186-9dfcc8967ec0" (UID: "62937ebb-ead0-4d96-b186-9dfcc8967ec0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:16:06 crc kubenswrapper[4565]: I1125 09:16:06.580209 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62937ebb-ead0-4d96-b186-9dfcc8967ec0-kube-api-access-cc86s" (OuterVolumeSpecName: "kube-api-access-cc86s") pod "62937ebb-ead0-4d96-b186-9dfcc8967ec0" (UID: "62937ebb-ead0-4d96-b186-9dfcc8967ec0"). InnerVolumeSpecName "kube-api-access-cc86s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:16:06 crc kubenswrapper[4565]: I1125 09:16:06.584202 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62937ebb-ead0-4d96-b186-9dfcc8967ec0-util" (OuterVolumeSpecName: "util") pod "62937ebb-ead0-4d96-b186-9dfcc8967ec0" (UID: "62937ebb-ead0-4d96-b186-9dfcc8967ec0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:16:06 crc kubenswrapper[4565]: I1125 09:16:06.675066 4565 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62937ebb-ead0-4d96-b186-9dfcc8967ec0-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:16:06 crc kubenswrapper[4565]: I1125 09:16:06.675191 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc86s\" (UniqueName: \"kubernetes.io/projected/62937ebb-ead0-4d96-b186-9dfcc8967ec0-kube-api-access-cc86s\") on node \"crc\" DevicePath \"\"" Nov 25 09:16:06 crc kubenswrapper[4565]: I1125 09:16:06.675266 4565 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62937ebb-ead0-4d96-b186-9dfcc8967ec0-util\") on node \"crc\" DevicePath \"\"" Nov 25 09:16:07 crc kubenswrapper[4565]: I1125 09:16:07.305219 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj" event={"ID":"62937ebb-ead0-4d96-b186-9dfcc8967ec0","Type":"ContainerDied","Data":"f0a5d16dd2d582f00f0847f6accd7740746f603787b1106043931f05de340710"} Nov 25 09:16:07 crc kubenswrapper[4565]: I1125 09:16:07.305250 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj" Nov 25 09:16:07 crc kubenswrapper[4565]: I1125 09:16:07.305257 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0a5d16dd2d582f00f0847f6accd7740746f603787b1106043931f05de340710" Nov 25 09:16:10 crc kubenswrapper[4565]: I1125 09:16:10.009721 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7b567956b5-s8c4s"] Nov 25 09:16:10 crc kubenswrapper[4565]: E1125 09:16:10.010183 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62937ebb-ead0-4d96-b186-9dfcc8967ec0" containerName="extract" Nov 25 09:16:10 crc kubenswrapper[4565]: I1125 09:16:10.010195 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="62937ebb-ead0-4d96-b186-9dfcc8967ec0" containerName="extract" Nov 25 09:16:10 crc kubenswrapper[4565]: E1125 09:16:10.010207 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62937ebb-ead0-4d96-b186-9dfcc8967ec0" containerName="pull" Nov 25 09:16:10 crc kubenswrapper[4565]: I1125 09:16:10.010212 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="62937ebb-ead0-4d96-b186-9dfcc8967ec0" containerName="pull" Nov 25 09:16:10 crc kubenswrapper[4565]: E1125 09:16:10.010233 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62937ebb-ead0-4d96-b186-9dfcc8967ec0" containerName="util" Nov 25 09:16:10 crc kubenswrapper[4565]: I1125 09:16:10.010238 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="62937ebb-ead0-4d96-b186-9dfcc8967ec0" containerName="util" Nov 25 09:16:10 crc kubenswrapper[4565]: I1125 09:16:10.010340 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="62937ebb-ead0-4d96-b186-9dfcc8967ec0" containerName="extract" Nov 25 09:16:10 crc kubenswrapper[4565]: I1125 09:16:10.010706 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-s8c4s" Nov 25 09:16:10 crc kubenswrapper[4565]: I1125 09:16:10.012199 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-fmk8b" Nov 25 09:16:10 crc kubenswrapper[4565]: I1125 09:16:10.046269 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7b567956b5-s8c4s"] Nov 25 09:16:10 crc kubenswrapper[4565]: I1125 09:16:10.118066 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65twv\" (UniqueName: \"kubernetes.io/projected/0c32d371-4207-4e71-8031-a27b6562f9a2-kube-api-access-65twv\") pod \"openstack-operator-controller-operator-7b567956b5-s8c4s\" (UID: \"0c32d371-4207-4e71-8031-a27b6562f9a2\") " pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-s8c4s" Nov 25 09:16:10 crc kubenswrapper[4565]: I1125 09:16:10.220282 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65twv\" (UniqueName: \"kubernetes.io/projected/0c32d371-4207-4e71-8031-a27b6562f9a2-kube-api-access-65twv\") pod \"openstack-operator-controller-operator-7b567956b5-s8c4s\" (UID: \"0c32d371-4207-4e71-8031-a27b6562f9a2\") " pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-s8c4s" Nov 25 09:16:10 crc kubenswrapper[4565]: I1125 09:16:10.244641 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65twv\" (UniqueName: \"kubernetes.io/projected/0c32d371-4207-4e71-8031-a27b6562f9a2-kube-api-access-65twv\") pod \"openstack-operator-controller-operator-7b567956b5-s8c4s\" (UID: \"0c32d371-4207-4e71-8031-a27b6562f9a2\") " pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-s8c4s" Nov 25 09:16:10 crc kubenswrapper[4565]: I1125 09:16:10.323299 4565 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-s8c4s" Nov 25 09:16:10 crc kubenswrapper[4565]: I1125 09:16:10.696349 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7b567956b5-s8c4s"] Nov 25 09:16:11 crc kubenswrapper[4565]: I1125 09:16:11.322839 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-s8c4s" event={"ID":"0c32d371-4207-4e71-8031-a27b6562f9a2","Type":"ContainerStarted","Data":"72d1675f6496cfe5912a8e68da00344347b880dc18a84deb66b41ca5f1b83262"} Nov 25 09:16:15 crc kubenswrapper[4565]: I1125 09:16:15.351261 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-s8c4s" event={"ID":"0c32d371-4207-4e71-8031-a27b6562f9a2","Type":"ContainerStarted","Data":"00b03beda1dbb5ff9b2b8b4e22cfa4e7e5a4452c957cb601e4d11b54b326fd52"} Nov 25 09:16:15 crc kubenswrapper[4565]: I1125 09:16:15.352031 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-s8c4s" Nov 25 09:16:15 crc kubenswrapper[4565]: I1125 09:16:15.381166 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-s8c4s" podStartSLOduration=2.186238552 podStartE2EDuration="6.381138294s" podCreationTimestamp="2025-11-25 09:16:09 +0000 UTC" firstStartedPulling="2025-11-25 09:16:10.70086407 +0000 UTC m=+703.903359207" lastFinishedPulling="2025-11-25 09:16:14.895763811 +0000 UTC m=+708.098258949" observedRunningTime="2025-11-25 09:16:15.38112602 +0000 UTC m=+708.583621158" watchObservedRunningTime="2025-11-25 09:16:15.381138294 +0000 UTC m=+708.583633432" Nov 25 09:16:20 crc kubenswrapper[4565]: I1125 09:16:20.326378 4565 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-s8c4s" Nov 25 09:16:25 crc kubenswrapper[4565]: I1125 09:16:25.099550 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:16:25 crc kubenswrapper[4565]: I1125 09:16:25.100882 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.111127 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.112410 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.115643 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-ln4nc" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.134850 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.139544 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.140776 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.143447 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-zqs2w" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.158161 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.177920 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.179043 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.182650 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-q77j7" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.183374 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.186816 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.188071 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-z6jrc" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.207307 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.209829 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.211039 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.216744 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.217264 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-4dlcq" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.229280 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.230034 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.238206 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-xrp48" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.240232 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bfqc\" (UniqueName: \"kubernetes.io/projected/873884b1-6ee8-400c-9ca2-0b0b3c4618e9-kube-api-access-4bfqc\") pod \"barbican-operator-controller-manager-86dc4d89c8-cxwrc\" (UID: \"873884b1-6ee8-400c-9ca2-0b0b3c4618e9\") " pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.240280 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5d8k\" (UniqueName: \"kubernetes.io/projected/93da1f7e-c5e8-4c9c-b6af-feb85c526b47-kube-api-access-r5d8k\") pod \"heat-operator-controller-manager-774b86978c-bd8d6\" (UID: 
\"93da1f7e-c5e8-4c9c-b6af-feb85c526b47\") " pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.240325 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvpw4\" (UniqueName: \"kubernetes.io/projected/1af57713-55c3-45ec-b98b-1aac75a2d60b-kube-api-access-xvpw4\") pod \"cinder-operator-controller-manager-79856dc55c-ddlth\" (UID: \"1af57713-55c3-45ec-b98b-1aac75a2d60b\") " pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.248409 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.256357 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.275246 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.276292 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.284790 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-7kwx8" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.288947 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.296344 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.315203 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.327491 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-7kkqt" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.342227 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/333ae034-2972-4915-a547-364c01510827-cert\") pod \"infra-operator-controller-manager-d5cc86f4b-2q9rf\" (UID: \"333ae034-2972-4915-a547-364c01510827\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.342289 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bfqc\" (UniqueName: \"kubernetes.io/projected/873884b1-6ee8-400c-9ca2-0b0b3c4618e9-kube-api-access-4bfqc\") pod \"barbican-operator-controller-manager-86dc4d89c8-cxwrc\" (UID: \"873884b1-6ee8-400c-9ca2-0b0b3c4618e9\") " 
pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.342335 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5d8k\" (UniqueName: \"kubernetes.io/projected/93da1f7e-c5e8-4c9c-b6af-feb85c526b47-kube-api-access-r5d8k\") pod \"heat-operator-controller-manager-774b86978c-bd8d6\" (UID: \"93da1f7e-c5e8-4c9c-b6af-feb85c526b47\") " pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.342369 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxzfn\" (UniqueName: \"kubernetes.io/projected/92be75e0-b60b-4f41-bde1-4f74a4d306e3-kube-api-access-nxzfn\") pod \"glance-operator-controller-manager-68b95954c9-f9bbj\" (UID: \"92be75e0-b60b-4f41-bde1-4f74a4d306e3\") " pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.342395 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvpw4\" (UniqueName: \"kubernetes.io/projected/1af57713-55c3-45ec-b98b-1aac75a2d60b-kube-api-access-xvpw4\") pod \"cinder-operator-controller-manager-79856dc55c-ddlth\" (UID: \"1af57713-55c3-45ec-b98b-1aac75a2d60b\") " pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.342416 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtm6p\" (UniqueName: \"kubernetes.io/projected/333ae034-2972-4915-a547-364c01510827-kube-api-access-gtm6p\") pod \"infra-operator-controller-manager-d5cc86f4b-2q9rf\" (UID: \"333ae034-2972-4915-a547-364c01510827\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 
09:16:37.342440 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfxf8\" (UniqueName: \"kubernetes.io/projected/a933a688-5393-4b7b-b0b7-6ee5791970b1-kube-api-access-wfxf8\") pod \"designate-operator-controller-manager-7d695c9b56-t68ww\" (UID: \"a933a688-5393-4b7b-b0b7-6ee5791970b1\") " pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.342481 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pbfc\" (UniqueName: \"kubernetes.io/projected/354fe5db-35d0-4d94-989c-02a077f8bd20-kube-api-access-5pbfc\") pod \"horizon-operator-controller-manager-68c9694994-2s9lf\" (UID: \"354fe5db-35d0-4d94-989c-02a077f8bd20\") " pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.367574 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bfqc\" (UniqueName: \"kubernetes.io/projected/873884b1-6ee8-400c-9ca2-0b0b3c4618e9-kube-api-access-4bfqc\") pod \"barbican-operator-controller-manager-86dc4d89c8-cxwrc\" (UID: \"873884b1-6ee8-400c-9ca2-0b0b3c4618e9\") " pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.387233 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.389495 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5d8k\" (UniqueName: \"kubernetes.io/projected/93da1f7e-c5e8-4c9c-b6af-feb85c526b47-kube-api-access-r5d8k\") pod \"heat-operator-controller-manager-774b86978c-bd8d6\" (UID: \"93da1f7e-c5e8-4c9c-b6af-feb85c526b47\") " 
pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.421487 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvpw4\" (UniqueName: \"kubernetes.io/projected/1af57713-55c3-45ec-b98b-1aac75a2d60b-kube-api-access-xvpw4\") pod \"cinder-operator-controller-manager-79856dc55c-ddlth\" (UID: \"1af57713-55c3-45ec-b98b-1aac75a2d60b\") " pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.423606 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.428419 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.444310 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/333ae034-2972-4915-a547-364c01510827-cert\") pod \"infra-operator-controller-manager-d5cc86f4b-2q9rf\" (UID: \"333ae034-2972-4915-a547-364c01510827\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.444369 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7d8x\" (UniqueName: \"kubernetes.io/projected/6402fac4-067f-4410-a00c-0d438d502f3c-kube-api-access-c7d8x\") pod \"ironic-operator-controller-manager-5bfcdc958c-mjsqx\" (UID: \"6402fac4-067f-4410-a00c-0d438d502f3c\") " pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.444442 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nxzfn\" (UniqueName: \"kubernetes.io/projected/92be75e0-b60b-4f41-bde1-4f74a4d306e3-kube-api-access-nxzfn\") pod \"glance-operator-controller-manager-68b95954c9-f9bbj\" (UID: \"92be75e0-b60b-4f41-bde1-4f74a4d306e3\") " pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.444474 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtm6p\" (UniqueName: \"kubernetes.io/projected/333ae034-2972-4915-a547-364c01510827-kube-api-access-gtm6p\") pod \"infra-operator-controller-manager-d5cc86f4b-2q9rf\" (UID: \"333ae034-2972-4915-a547-364c01510827\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.444496 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfxf8\" (UniqueName: \"kubernetes.io/projected/a933a688-5393-4b7b-b0b7-6ee5791970b1-kube-api-access-wfxf8\") pod \"designate-operator-controller-manager-7d695c9b56-t68ww\" (UID: \"a933a688-5393-4b7b-b0b7-6ee5791970b1\") " pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.444531 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pbfc\" (UniqueName: \"kubernetes.io/projected/354fe5db-35d0-4d94-989c-02a077f8bd20-kube-api-access-5pbfc\") pod \"horizon-operator-controller-manager-68c9694994-2s9lf\" (UID: \"354fe5db-35d0-4d94-989c-02a077f8bd20\") " pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" Nov 25 09:16:37 crc kubenswrapper[4565]: E1125 09:16:37.445081 4565 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 25 09:16:37 crc kubenswrapper[4565]: E1125 09:16:37.445289 4565 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333ae034-2972-4915-a547-364c01510827-cert podName:333ae034-2972-4915-a547-364c01510827 nodeName:}" failed. No retries permitted until 2025-11-25 09:16:37.945231049 +0000 UTC m=+731.147726187 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/333ae034-2972-4915-a547-364c01510827-cert") pod "infra-operator-controller-manager-d5cc86f4b-2q9rf" (UID: "333ae034-2972-4915-a547-364c01510827") : secret "infra-operator-webhook-server-cert" not found Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.477157 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.477519 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxzfn\" (UniqueName: \"kubernetes.io/projected/92be75e0-b60b-4f41-bde1-4f74a4d306e3-kube-api-access-nxzfn\") pod \"glance-operator-controller-manager-68b95954c9-f9bbj\" (UID: \"92be75e0-b60b-4f41-bde1-4f74a4d306e3\") " pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.477900 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.478293 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.490195 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.495561 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-9825s" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.504232 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.504556 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtm6p\" (UniqueName: \"kubernetes.io/projected/333ae034-2972-4915-a547-364c01510827-kube-api-access-gtm6p\") pod \"infra-operator-controller-manager-d5cc86f4b-2q9rf\" (UID: \"333ae034-2972-4915-a547-364c01510827\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.512682 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.513662 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.514454 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfxf8\" (UniqueName: \"kubernetes.io/projected/a933a688-5393-4b7b-b0b7-6ee5791970b1-kube-api-access-wfxf8\") pod \"designate-operator-controller-manager-7d695c9b56-t68ww\" (UID: \"a933a688-5393-4b7b-b0b7-6ee5791970b1\") " pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.519354 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pbfc\" (UniqueName: \"kubernetes.io/projected/354fe5db-35d0-4d94-989c-02a077f8bd20-kube-api-access-5pbfc\") pod \"horizon-operator-controller-manager-68c9694994-2s9lf\" (UID: \"354fe5db-35d0-4d94-989c-02a077f8bd20\") " pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.526372 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-g9jgd" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.526893 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.537970 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.542143 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.542443 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.543625 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.546653 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-n9qnr" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.547270 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7d8x\" (UniqueName: \"kubernetes.io/projected/6402fac4-067f-4410-a00c-0d438d502f3c-kube-api-access-c7d8x\") pod \"ironic-operator-controller-manager-5bfcdc958c-mjsqx\" (UID: \"6402fac4-067f-4410-a00c-0d438d502f3c\") " pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.547328 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqlpk\" (UniqueName: \"kubernetes.io/projected/cf68120a-e894-4189-8035-91f8045618c0-kube-api-access-cqlpk\") pod \"manila-operator-controller-manager-58bb8d67cc-lz6zt\" (UID: \"cf68120a-e894-4189-8035-91f8045618c0\") " pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.575298 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.589478 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-c7d8x\" (UniqueName: \"kubernetes.io/projected/6402fac4-067f-4410-a00c-0d438d502f3c-kube-api-access-c7d8x\") pod \"ironic-operator-controller-manager-5bfcdc958c-mjsqx\" (UID: \"6402fac4-067f-4410-a00c-0d438d502f3c\") " pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.613134 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.627093 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.628206 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.635887 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-qrbtx" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.640597 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.641164 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.653801 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.654750 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.655490 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.656692 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq49m\" (UniqueName: \"kubernetes.io/projected/4ee66804-213d-4e52-b04b-6b00eec8de2d-kube-api-access-xq49m\") pod \"mariadb-operator-controller-manager-cb6c4fdb7-2gkww\" (UID: \"4ee66804-213d-4e52-b04b-6b00eec8de2d\") " pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.656793 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rfcd\" (UniqueName: \"kubernetes.io/projected/d5be161b-0f0c-485e-b1c7-50a9fff4b053-kube-api-access-4rfcd\") pod \"keystone-operator-controller-manager-748dc6576f-pcqxq\" (UID: \"d5be161b-0f0c-485e-b1c7-50a9fff4b053\") " pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.656818 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-cqlpk\" (UniqueName: \"kubernetes.io/projected/cf68120a-e894-4189-8035-91f8045618c0-kube-api-access-cqlpk\") pod \"manila-operator-controller-manager-58bb8d67cc-lz6zt\" (UID: \"cf68120a-e894-4189-8035-91f8045618c0\") " pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.657849 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.663662 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-8t6bx" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.663800 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-69k9h" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.669678 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.716666 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqlpk\" (UniqueName: \"kubernetes.io/projected/cf68120a-e894-4189-8035-91f8045618c0-kube-api-access-cqlpk\") pod \"manila-operator-controller-manager-58bb8d67cc-lz6zt\" (UID: \"cf68120a-e894-4189-8035-91f8045618c0\") " pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.716974 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.721969 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6"] 
Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.722916 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.731644 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.732904 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.737993 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-d9k2v" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.739039 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.739183 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-v7wmt" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.748204 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.755286 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.755894 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.757895 4565 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5kj4\" (UniqueName: \"kubernetes.io/projected/6279e5b8-cc23-4b43-9554-754a61174bcd-kube-api-access-z5kj4\") pod \"octavia-operator-controller-manager-fd75fd47d-hrr6t\" (UID: \"6279e5b8-cc23-4b43-9554-754a61174bcd\") " pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.757988 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq49m\" (UniqueName: \"kubernetes.io/projected/4ee66804-213d-4e52-b04b-6b00eec8de2d-kube-api-access-xq49m\") pod \"mariadb-operator-controller-manager-cb6c4fdb7-2gkww\" (UID: \"4ee66804-213d-4e52-b04b-6b00eec8de2d\") " pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.758025 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g9pk\" (UniqueName: \"kubernetes.io/projected/f2c67417-c283-4158-91ec-f49478a5378e-kube-api-access-5g9pk\") pod \"nova-operator-controller-manager-79556f57fc-n9bdd\" (UID: \"f2c67417-c283-4158-91ec-f49478a5378e\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.758147 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rfcd\" (UniqueName: \"kubernetes.io/projected/d5be161b-0f0c-485e-b1c7-50a9fff4b053-kube-api-access-4rfcd\") pod \"keystone-operator-controller-manager-748dc6576f-pcqxq\" (UID: \"d5be161b-0f0c-485e-b1c7-50a9fff4b053\") " pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.758218 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xx8d\" (UniqueName: 
\"kubernetes.io/projected/d0ef0237-045a-4153-a377-07b2c9e6ceba-kube-api-access-2xx8d\") pod \"neutron-operator-controller-manager-7c57c8bbc4-pzd74\" (UID: \"d0ef0237-045a-4153-a377-07b2c9e6ceba\") " pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.760273 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.767567 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.768569 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.771414 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.771563 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-gccb8" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.776813 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.777967 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.777973 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-mlncl" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.779708 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-hrdhn" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.785981 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.790065 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cb74df96-sj4j7"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.793615 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rfcd\" (UniqueName: \"kubernetes.io/projected/d5be161b-0f0c-485e-b1c7-50a9fff4b053-kube-api-access-4rfcd\") pod \"keystone-operator-controller-manager-748dc6576f-pcqxq\" (UID: \"d5be161b-0f0c-485e-b1c7-50a9fff4b053\") " pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.803087 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.803119 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cb74df96-sj4j7"] Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.803197 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cb74df96-sj4j7" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.808760 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-xm9gt" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.814465 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq49m\" (UniqueName: \"kubernetes.io/projected/4ee66804-213d-4e52-b04b-6b00eec8de2d-kube-api-access-xq49m\") pod \"mariadb-operator-controller-manager-cb6c4fdb7-2gkww\" (UID: \"4ee66804-213d-4e52-b04b-6b00eec8de2d\") " pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.840465 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.872891 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.883226 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xx8d\" (UniqueName: \"kubernetes.io/projected/d0ef0237-045a-4153-a377-07b2c9e6ceba-kube-api-access-2xx8d\") pod \"neutron-operator-controller-manager-7c57c8bbc4-pzd74\" (UID: \"d0ef0237-045a-4153-a377-07b2c9e6ceba\") " pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.883297 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfnt9\" (UniqueName: \"kubernetes.io/projected/1ef630cb-2220-41f5-8a3d-66a2a78ce0ce-kube-api-access-qfnt9\") pod \"telemetry-operator-controller-manager-567f98c9d-7dzx4\" (UID: \"1ef630cb-2220-41f5-8a3d-66a2a78ce0ce\") " pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.883336 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvxtt\" (UniqueName: \"kubernetes.io/projected/052c7786-4d54-4af0-8598-91ff09cdf966-kube-api-access-cvxtt\") pod \"ovn-operator-controller-manager-66cf5c67ff-zz6wf\" (UID: \"052c7786-4d54-4af0-8598-91ff09cdf966\") " pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.883393 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5kj4\" (UniqueName: \"kubernetes.io/projected/6279e5b8-cc23-4b43-9554-754a61174bcd-kube-api-access-z5kj4\") pod \"octavia-operator-controller-manager-fd75fd47d-hrr6t\" (UID: \"6279e5b8-cc23-4b43-9554-754a61174bcd\") " pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" Nov 25 09:16:37 crc 
kubenswrapper[4565]: I1125 09:16:37.883439 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g9pk\" (UniqueName: \"kubernetes.io/projected/f2c67417-c283-4158-91ec-f49478a5378e-kube-api-access-5g9pk\") pod \"nova-operator-controller-manager-79556f57fc-n9bdd\" (UID: \"f2c67417-c283-4158-91ec-f49478a5378e\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.883485 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4a03edc-1b0f-4f50-bab7-b2292c453f4d-cert\") pod \"openstack-baremetal-operator-controller-manager-b58f89467-sw4l6\" (UID: \"d4a03edc-1b0f-4f50-bab7-b2292c453f4d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.883509 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzl5w\" (UniqueName: \"kubernetes.io/projected/d4a03edc-1b0f-4f50-bab7-b2292c453f4d-kube-api-access-mzl5w\") pod \"openstack-baremetal-operator-controller-manager-b58f89467-sw4l6\" (UID: \"d4a03edc-1b0f-4f50-bab7-b2292c453f4d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.883555 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvmr8\" (UniqueName: \"kubernetes.io/projected/f35f4446-328e-40d3-96d6-2bc814fb8a96-kube-api-access-wvmr8\") pod \"swift-operator-controller-manager-6fdc4fcf86-zl2jr\" (UID: \"f35f4446-328e-40d3-96d6-2bc814fb8a96\") " pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.883583 4565 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgfg8\" (UniqueName: \"kubernetes.io/projected/31dbf471-6fab-4ddd-a384-e4dd5335d5dc-kube-api-access-qgfg8\") pod \"placement-operator-controller-manager-5db546f9d9-kgn59\" (UID: \"31dbf471-6fab-4ddd-a384-e4dd5335d5dc\") " pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.894066 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.918788 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xx8d\" (UniqueName: \"kubernetes.io/projected/d0ef0237-045a-4153-a377-07b2c9e6ceba-kube-api-access-2xx8d\") pod \"neutron-operator-controller-manager-7c57c8bbc4-pzd74\" (UID: \"d0ef0237-045a-4153-a377-07b2c9e6ceba\") " pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.954441 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g9pk\" (UniqueName: \"kubernetes.io/projected/f2c67417-c283-4158-91ec-f49478a5378e-kube-api-access-5g9pk\") pod \"nova-operator-controller-manager-79556f57fc-n9bdd\" (UID: \"f2c67417-c283-4158-91ec-f49478a5378e\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.969967 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5kj4\" (UniqueName: \"kubernetes.io/projected/6279e5b8-cc23-4b43-9554-754a61174bcd-kube-api-access-z5kj4\") pod \"octavia-operator-controller-manager-fd75fd47d-hrr6t\" (UID: \"6279e5b8-cc23-4b43-9554-754a61174bcd\") " pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" Nov 25 09:16:37 crc kubenswrapper[4565]: 
I1125 09:16:37.977113 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.988184 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/333ae034-2972-4915-a547-364c01510827-cert\") pod \"infra-operator-controller-manager-d5cc86f4b-2q9rf\" (UID: \"333ae034-2972-4915-a547-364c01510827\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.988249 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4a03edc-1b0f-4f50-bab7-b2292c453f4d-cert\") pod \"openstack-baremetal-operator-controller-manager-b58f89467-sw4l6\" (UID: \"d4a03edc-1b0f-4f50-bab7-b2292c453f4d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.988279 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzl5w\" (UniqueName: \"kubernetes.io/projected/d4a03edc-1b0f-4f50-bab7-b2292c453f4d-kube-api-access-mzl5w\") pod \"openstack-baremetal-operator-controller-manager-b58f89467-sw4l6\" (UID: \"d4a03edc-1b0f-4f50-bab7-b2292c453f4d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.988477 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvmr8\" (UniqueName: \"kubernetes.io/projected/f35f4446-328e-40d3-96d6-2bc814fb8a96-kube-api-access-wvmr8\") pod \"swift-operator-controller-manager-6fdc4fcf86-zl2jr\" (UID: \"f35f4446-328e-40d3-96d6-2bc814fb8a96\") " pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" Nov 25 09:16:37 
crc kubenswrapper[4565]: I1125 09:16:37.988514 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgfg8\" (UniqueName: \"kubernetes.io/projected/31dbf471-6fab-4ddd-a384-e4dd5335d5dc-kube-api-access-qgfg8\") pod \"placement-operator-controller-manager-5db546f9d9-kgn59\" (UID: \"31dbf471-6fab-4ddd-a384-e4dd5335d5dc\") " pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.988553 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqf5n\" (UniqueName: \"kubernetes.io/projected/cbdce822-eeeb-448b-9f3b-46fdf9e9b43d-kube-api-access-hqf5n\") pod \"test-operator-controller-manager-5cb74df96-sj4j7\" (UID: \"cbdce822-eeeb-448b-9f3b-46fdf9e9b43d\") " pod="openstack-operators/test-operator-controller-manager-5cb74df96-sj4j7" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.988617 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfnt9\" (UniqueName: \"kubernetes.io/projected/1ef630cb-2220-41f5-8a3d-66a2a78ce0ce-kube-api-access-qfnt9\") pod \"telemetry-operator-controller-manager-567f98c9d-7dzx4\" (UID: \"1ef630cb-2220-41f5-8a3d-66a2a78ce0ce\") " pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.988661 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvxtt\" (UniqueName: \"kubernetes.io/projected/052c7786-4d54-4af0-8598-91ff09cdf966-kube-api-access-cvxtt\") pod \"ovn-operator-controller-manager-66cf5c67ff-zz6wf\" (UID: \"052c7786-4d54-4af0-8598-91ff09cdf966\") " pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" Nov 25 09:16:37 crc kubenswrapper[4565]: E1125 09:16:37.989212 4565 secret.go:188] Couldn't get secret 
openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 25 09:16:37 crc kubenswrapper[4565]: E1125 09:16:37.989286 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4a03edc-1b0f-4f50-bab7-b2292c453f4d-cert podName:d4a03edc-1b0f-4f50-bab7-b2292c453f4d nodeName:}" failed. No retries permitted until 2025-11-25 09:16:38.489270415 +0000 UTC m=+731.691765552 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d4a03edc-1b0f-4f50-bab7-b2292c453f4d-cert") pod "openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" (UID: "d4a03edc-1b0f-4f50-bab7-b2292c453f4d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.996160 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" Nov 25 09:16:37 crc kubenswrapper[4565]: I1125 09:16:37.999330 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/333ae034-2972-4915-a547-364c01510827-cert\") pod \"infra-operator-controller-manager-d5cc86f4b-2q9rf\" (UID: \"333ae034-2972-4915-a547-364c01510827\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.021306 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-864885998-v2c96"] Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.022921 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.025743 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-864885998-v2c96"] Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.046785 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgfg8\" (UniqueName: \"kubernetes.io/projected/31dbf471-6fab-4ddd-a384-e4dd5335d5dc-kube-api-access-qgfg8\") pod \"placement-operator-controller-manager-5db546f9d9-kgn59\" (UID: \"31dbf471-6fab-4ddd-a384-e4dd5335d5dc\") " pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.051058 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-f7lv9" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.055414 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvmr8\" (UniqueName: \"kubernetes.io/projected/f35f4446-328e-40d3-96d6-2bc814fb8a96-kube-api-access-wvmr8\") pod \"swift-operator-controller-manager-6fdc4fcf86-zl2jr\" (UID: \"f35f4446-328e-40d3-96d6-2bc814fb8a96\") " pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.075440 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvxtt\" (UniqueName: \"kubernetes.io/projected/052c7786-4d54-4af0-8598-91ff09cdf966-kube-api-access-cvxtt\") pod \"ovn-operator-controller-manager-66cf5c67ff-zz6wf\" (UID: \"052c7786-4d54-4af0-8598-91ff09cdf966\") " pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.085096 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qfnt9\" (UniqueName: \"kubernetes.io/projected/1ef630cb-2220-41f5-8a3d-66a2a78ce0ce-kube-api-access-qfnt9\") pod \"telemetry-operator-controller-manager-567f98c9d-7dzx4\" (UID: \"1ef630cb-2220-41f5-8a3d-66a2a78ce0ce\") " pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.087437 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzl5w\" (UniqueName: \"kubernetes.io/projected/d4a03edc-1b0f-4f50-bab7-b2292c453f4d-kube-api-access-mzl5w\") pod \"openstack-baremetal-operator-controller-manager-b58f89467-sw4l6\" (UID: \"d4a03edc-1b0f-4f50-bab7-b2292c453f4d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.090872 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqf5n\" (UniqueName: \"kubernetes.io/projected/cbdce822-eeeb-448b-9f3b-46fdf9e9b43d-kube-api-access-hqf5n\") pod \"test-operator-controller-manager-5cb74df96-sj4j7\" (UID: \"cbdce822-eeeb-448b-9f3b-46fdf9e9b43d\") " pod="openstack-operators/test-operator-controller-manager-5cb74df96-sj4j7" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.116575 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l"] Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.117588 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.148481 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.148671 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-r2h4v" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.149910 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.169608 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqf5n\" (UniqueName: \"kubernetes.io/projected/cbdce822-eeeb-448b-9f3b-46fdf9e9b43d-kube-api-access-hqf5n\") pod \"test-operator-controller-manager-5cb74df96-sj4j7\" (UID: \"cbdce822-eeeb-448b-9f3b-46fdf9e9b43d\") " pod="openstack-operators/test-operator-controller-manager-5cb74df96-sj4j7" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.169994 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.173441 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l"] Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.194982 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zthg\" (UniqueName: \"kubernetes.io/projected/3791b99a-d877-470f-8a8f-56f7b02be997-kube-api-access-8zthg\") pod \"watcher-operator-controller-manager-864885998-v2c96\" (UID: \"3791b99a-d877-470f-8a8f-56f7b02be997\") " pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.207159 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.218072 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.223620 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4llp"] Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.228765 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4llp"] Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.228952 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4llp" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.233481 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-d5gvb" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.263868 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.297179 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zthg\" (UniqueName: \"kubernetes.io/projected/3791b99a-d877-470f-8a8f-56f7b02be997-kube-api-access-8zthg\") pod \"watcher-operator-controller-manager-864885998-v2c96\" (UID: \"3791b99a-d877-470f-8a8f-56f7b02be997\") " pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.297530 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/579400cf-d71f-47f4-a98e-b94ccbf4ff72-webhook-certs\") pod \"openstack-operator-controller-manager-7cd5954d9-fkc7l\" (UID: \"579400cf-d71f-47f4-a98e-b94ccbf4ff72\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.297681 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/579400cf-d71f-47f4-a98e-b94ccbf4ff72-metrics-certs\") pod \"openstack-operator-controller-manager-7cd5954d9-fkc7l\" (UID: \"579400cf-d71f-47f4-a98e-b94ccbf4ff72\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.297813 4565 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h52l\" (UniqueName: \"kubernetes.io/projected/579400cf-d71f-47f4-a98e-b94ccbf4ff72-kube-api-access-4h52l\") pod \"openstack-operator-controller-manager-7cd5954d9-fkc7l\" (UID: \"579400cf-d71f-47f4-a98e-b94ccbf4ff72\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.306805 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.337329 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cb74df96-sj4j7" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.347472 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zthg\" (UniqueName: \"kubernetes.io/projected/3791b99a-d877-470f-8a8f-56f7b02be997-kube-api-access-8zthg\") pod \"watcher-operator-controller-manager-864885998-v2c96\" (UID: \"3791b99a-d877-470f-8a8f-56f7b02be997\") " pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.370331 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.388225 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.397889 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6"] Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.398757 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h52l\" (UniqueName: \"kubernetes.io/projected/579400cf-d71f-47f4-a98e-b94ccbf4ff72-kube-api-access-4h52l\") pod \"openstack-operator-controller-manager-7cd5954d9-fkc7l\" (UID: \"579400cf-d71f-47f4-a98e-b94ccbf4ff72\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.398943 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/579400cf-d71f-47f4-a98e-b94ccbf4ff72-webhook-certs\") pod \"openstack-operator-controller-manager-7cd5954d9-fkc7l\" (UID: \"579400cf-d71f-47f4-a98e-b94ccbf4ff72\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.399078 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5ph9\" (UniqueName: \"kubernetes.io/projected/a65931e1-7a1f-4251-9c4f-996b407dfb03-kube-api-access-x5ph9\") pod \"rabbitmq-cluster-operator-manager-668c99d594-s4llp\" (UID: \"a65931e1-7a1f-4251-9c4f-996b407dfb03\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4llp" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.399192 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/579400cf-d71f-47f4-a98e-b94ccbf4ff72-metrics-certs\") pod \"openstack-operator-controller-manager-7cd5954d9-fkc7l\" 
(UID: \"579400cf-d71f-47f4-a98e-b94ccbf4ff72\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" Nov 25 09:16:38 crc kubenswrapper[4565]: E1125 09:16:38.399432 4565 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 25 09:16:38 crc kubenswrapper[4565]: E1125 09:16:38.399548 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/579400cf-d71f-47f4-a98e-b94ccbf4ff72-metrics-certs podName:579400cf-d71f-47f4-a98e-b94ccbf4ff72 nodeName:}" failed. No retries permitted until 2025-11-25 09:16:38.899528308 +0000 UTC m=+732.102023446 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/579400cf-d71f-47f4-a98e-b94ccbf4ff72-metrics-certs") pod "openstack-operator-controller-manager-7cd5954d9-fkc7l" (UID: "579400cf-d71f-47f4-a98e-b94ccbf4ff72") : secret "metrics-server-cert" not found Nov 25 09:16:38 crc kubenswrapper[4565]: E1125 09:16:38.400043 4565 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 25 09:16:38 crc kubenswrapper[4565]: E1125 09:16:38.400158 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/579400cf-d71f-47f4-a98e-b94ccbf4ff72-webhook-certs podName:579400cf-d71f-47f4-a98e-b94ccbf4ff72 nodeName:}" failed. No retries permitted until 2025-11-25 09:16:38.90014962 +0000 UTC m=+732.102644758 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/579400cf-d71f-47f4-a98e-b94ccbf4ff72-webhook-certs") pod "openstack-operator-controller-manager-7cd5954d9-fkc7l" (UID: "579400cf-d71f-47f4-a98e-b94ccbf4ff72") : secret "webhook-server-cert" not found Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.461080 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h52l\" (UniqueName: \"kubernetes.io/projected/579400cf-d71f-47f4-a98e-b94ccbf4ff72-kube-api-access-4h52l\") pod \"openstack-operator-controller-manager-7cd5954d9-fkc7l\" (UID: \"579400cf-d71f-47f4-a98e-b94ccbf4ff72\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.500522 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4a03edc-1b0f-4f50-bab7-b2292c453f4d-cert\") pod \"openstack-baremetal-operator-controller-manager-b58f89467-sw4l6\" (UID: \"d4a03edc-1b0f-4f50-bab7-b2292c453f4d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.500625 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5ph9\" (UniqueName: \"kubernetes.io/projected/a65931e1-7a1f-4251-9c4f-996b407dfb03-kube-api-access-x5ph9\") pod \"rabbitmq-cluster-operator-manager-668c99d594-s4llp\" (UID: \"a65931e1-7a1f-4251-9c4f-996b407dfb03\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4llp" Nov 25 09:16:38 crc kubenswrapper[4565]: E1125 09:16:38.501047 4565 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 25 09:16:38 crc kubenswrapper[4565]: E1125 09:16:38.501181 4565 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/d4a03edc-1b0f-4f50-bab7-b2292c453f4d-cert podName:d4a03edc-1b0f-4f50-bab7-b2292c453f4d nodeName:}" failed. No retries permitted until 2025-11-25 09:16:39.501158598 +0000 UTC m=+732.703653736 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d4a03edc-1b0f-4f50-bab7-b2292c453f4d-cert") pod "openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" (UID: "d4a03edc-1b0f-4f50-bab7-b2292c453f4d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.513456 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" event={"ID":"93da1f7e-c5e8-4c9c-b6af-feb85c526b47","Type":"ContainerStarted","Data":"c7b9550647e67cc0c754ce9a884936fd283988b8da918d6e9d0f57885b261ba7"} Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.536916 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5ph9\" (UniqueName: \"kubernetes.io/projected/a65931e1-7a1f-4251-9c4f-996b407dfb03-kube-api-access-x5ph9\") pod \"rabbitmq-cluster-operator-manager-668c99d594-s4llp\" (UID: \"a65931e1-7a1f-4251-9c4f-996b407dfb03\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4llp" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.567777 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4llp" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.906769 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/579400cf-d71f-47f4-a98e-b94ccbf4ff72-webhook-certs\") pod \"openstack-operator-controller-manager-7cd5954d9-fkc7l\" (UID: \"579400cf-d71f-47f4-a98e-b94ccbf4ff72\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" Nov 25 09:16:38 crc kubenswrapper[4565]: I1125 09:16:38.907110 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/579400cf-d71f-47f4-a98e-b94ccbf4ff72-metrics-certs\") pod \"openstack-operator-controller-manager-7cd5954d9-fkc7l\" (UID: \"579400cf-d71f-47f4-a98e-b94ccbf4ff72\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" Nov 25 09:16:38 crc kubenswrapper[4565]: E1125 09:16:38.907423 4565 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 25 09:16:38 crc kubenswrapper[4565]: E1125 09:16:38.907542 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/579400cf-d71f-47f4-a98e-b94ccbf4ff72-metrics-certs podName:579400cf-d71f-47f4-a98e-b94ccbf4ff72 nodeName:}" failed. No retries permitted until 2025-11-25 09:16:39.907461695 +0000 UTC m=+733.109956833 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/579400cf-d71f-47f4-a98e-b94ccbf4ff72-metrics-certs") pod "openstack-operator-controller-manager-7cd5954d9-fkc7l" (UID: "579400cf-d71f-47f4-a98e-b94ccbf4ff72") : secret "metrics-server-cert" not found Nov 25 09:16:38 crc kubenswrapper[4565]: E1125 09:16:38.911231 4565 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 25 09:16:38 crc kubenswrapper[4565]: E1125 09:16:38.911311 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/579400cf-d71f-47f4-a98e-b94ccbf4ff72-webhook-certs podName:579400cf-d71f-47f4-a98e-b94ccbf4ff72 nodeName:}" failed. No retries permitted until 2025-11-25 09:16:39.911294712 +0000 UTC m=+733.113789849 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/579400cf-d71f-47f4-a98e-b94ccbf4ff72-webhook-certs") pod "openstack-operator-controller-manager-7cd5954d9-fkc7l" (UID: "579400cf-d71f-47f4-a98e-b94ccbf4ff72") : secret "webhook-server-cert" not found Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.007068 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt"] Nov 25 09:16:39 crc kubenswrapper[4565]: W1125 09:16:39.011537 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf68120a_e894_4189_8035_91f8045618c0.slice/crio-5abde8ec995dd0e98caf48fee92dfbe88735449e726b2e64c902bd1643751ec1 WatchSource:0}: Error finding container 5abde8ec995dd0e98caf48fee92dfbe88735449e726b2e64c902bd1643751ec1: Status 404 returned error can't find the container with id 5abde8ec995dd0e98caf48fee92dfbe88735449e726b2e64c902bd1643751ec1 Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.021600 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj"] Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.027823 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc"] Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.051940 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx"] Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.061287 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth"] Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.209687 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf"] Nov 25 09:16:39 crc kubenswrapper[4565]: W1125 09:16:39.216485 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod354fe5db_35d0_4d94_989c_02a077f8bd20.slice/crio-a6752355ba1f9d43b8f2ca7922b751b8d4769f2cfb85d5c5bd49559f4c11c75f WatchSource:0}: Error finding container a6752355ba1f9d43b8f2ca7922b751b8d4769f2cfb85d5c5bd49559f4c11c75f: Status 404 returned error can't find the container with id a6752355ba1f9d43b8f2ca7922b751b8d4769f2cfb85d5c5bd49559f4c11c75f Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.249186 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t"] Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.252833 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd"] Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.265000 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf"] 
Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.269810 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74"] Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.300239 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cb74df96-sj4j7"] Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.324667 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww"] Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.332247 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:c6405d94e56b40ef669729216ab4b9c441f34bb280902efa2940038c076b560f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wfxf8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-7d695c9b56-t68ww_openstack-operators(a933a688-5393-4b7b-b0b7-6ee5791970b1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.337481 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww"] Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.337792 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wfxf8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-7d695c9b56-t68ww_openstack-operators(a933a688-5393-4b7b-b0b7-6ee5791970b1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.339032 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" podUID="a933a688-5393-4b7b-b0b7-6ee5791970b1" Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.341240 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:7b90521b9e9cb4eb43c2f1c3bf85dbd068d684315f4f705b07708dd078df9d04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xq49m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-cb6c4fdb7-2gkww_openstack-operators(4ee66804-213d-4e52-b04b-6b00eec8de2d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.341514 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq"] Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.342843 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:3ef72bbd7cce89ff54d850ff44ca6d7b2360834a502da3d561aeb6fd3d9af50a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4rfcd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-748dc6576f-pcqxq_openstack-operators(d5be161b-0f0c-485e-b1c7-50a9fff4b053): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.343550 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xq49m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-cb6c4fdb7-2gkww_openstack-operators(4ee66804-213d-4e52-b04b-6b00eec8de2d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.345325 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" 
podUID="4ee66804-213d-4e52-b04b-6b00eec8de2d" Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.345743 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf"] Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.351153 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4rfcd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-748dc6576f-pcqxq_openstack-operators(d5be161b-0f0c-485e-b1c7-50a9fff4b053): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 09:16:39 crc 
kubenswrapper[4565]: E1125 09:16:39.351277 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:86df58f744c1d23233cc98f6ea17c8d6da637c50003d0fc8c100045594aa9894,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gtm6p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-d5cc86f4b-2q9rf_openstack-operators(333ae034-2972-4915-a547-364c01510827): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.352362 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" podUID="d5be161b-0f0c-485e-b1c7-50a9fff4b053" Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.353435 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gtm6p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-d5cc86f4b-2q9rf_openstack-operators(333ae034-2972-4915-a547-364c01510827): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.354914 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" podUID="333ae034-2972-4915-a547-364c01510827" Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.437444 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-864885998-v2c96"] Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.443688 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4"] Nov 25 09:16:39 crc 
kubenswrapper[4565]: I1125 09:16:39.452499 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59"] Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.452540 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr"] Nov 25 09:16:39 crc kubenswrapper[4565]: W1125 09:16:39.461192 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3791b99a_d877_470f_8a8f_56f7b02be997.slice/crio-1ad4957d958615c6368999704828c7f172b4ba664084f6e2c7e495e531a67880 WatchSource:0}: Error finding container 1ad4957d958615c6368999704828c7f172b4ba664084f6e2c7e495e531a67880: Status 404 returned error can't find the container with id 1ad4957d958615c6368999704828c7f172b4ba664084f6e2c7e495e531a67880 Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.463414 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:c0b5f124a37c1538042c0e63f0978429572e2a851d7f3a6eb80de09b86d755a0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wvmr8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-6fdc4fcf86-zl2jr_openstack-operators(f35f4446-328e-40d3-96d6-2bc814fb8a96): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.465755 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wvmr8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-6fdc4fcf86-zl2jr_openstack-operators(f35f4446-328e-40d3-96d6-2bc814fb8a96): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.467370 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" podUID="f35f4446-328e-40d3-96d6-2bc814fb8a96" Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.469268 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:4094e7fc11a33e8e2b6768a053cafaf5b122446d23f9113d43d520cb64e9776c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qgfg8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5db546f9d9-kgn59_openstack-operators(31dbf471-6fab-4ddd-a384-e4dd5335d5dc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.471337 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qgfg8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5db546f9d9-kgn59_openstack-operators(31dbf471-6fab-4ddd-a384-e4dd5335d5dc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.472079 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8zthg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-864885998-v2c96_openstack-operators(3791b99a-d877-470f-8a8f-56f7b02be997): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.472488 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4llp"] Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.472574 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for 
\"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" podUID="31dbf471-6fab-4ddd-a384-e4dd5335d5dc" Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.475111 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8zthg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-864885998-v2c96_openstack-operators(3791b99a-d877-470f-8a8f-56f7b02be997): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 
25 09:16:39 crc kubenswrapper[4565]: W1125 09:16:39.476072 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda65931e1_7a1f_4251_9c4f_996b407dfb03.slice/crio-f6e283c62fc26061e941237eefdfad675469854af52d96022370e993a23fa310 WatchSource:0}: Error finding container f6e283c62fc26061e941237eefdfad675469854af52d96022370e993a23fa310: Status 404 returned error can't find the container with id f6e283c62fc26061e941237eefdfad675469854af52d96022370e993a23fa310 Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.476194 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" podUID="3791b99a-d877-470f-8a8f-56f7b02be997" Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.478678 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi 
BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x5ph9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-s4llp_openstack-operators(a65931e1-7a1f-4251-9c4f-996b407dfb03): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.480630 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4llp" podUID="a65931e1-7a1f-4251-9c4f-996b407dfb03" Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.519615 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4a03edc-1b0f-4f50-bab7-b2292c453f4d-cert\") pod \"openstack-baremetal-operator-controller-manager-b58f89467-sw4l6\" (UID: \"d4a03edc-1b0f-4f50-bab7-b2292c453f4d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.519898 4565 
secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.519972 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4a03edc-1b0f-4f50-bab7-b2292c453f4d-cert podName:d4a03edc-1b0f-4f50-bab7-b2292c453f4d nodeName:}" failed. No retries permitted until 2025-11-25 09:16:41.519956608 +0000 UTC m=+734.722451746 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d4a03edc-1b0f-4f50-bab7-b2292c453f4d-cert") pod "openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" (UID: "d4a03edc-1b0f-4f50-bab7-b2292c453f4d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.523187 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4llp" event={"ID":"a65931e1-7a1f-4251-9c4f-996b407dfb03","Type":"ContainerStarted","Data":"f6e283c62fc26061e941237eefdfad675469854af52d96022370e993a23fa310"} Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.524131 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4llp" podUID="a65931e1-7a1f-4251-9c4f-996b407dfb03" Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.525622 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" 
event={"ID":"354fe5db-35d0-4d94-989c-02a077f8bd20","Type":"ContainerStarted","Data":"a6752355ba1f9d43b8f2ca7922b751b8d4769f2cfb85d5c5bd49559f4c11c75f"} Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.526827 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" event={"ID":"a933a688-5393-4b7b-b0b7-6ee5791970b1","Type":"ContainerStarted","Data":"e193b8e3f1d836651218d0a8b2d53c8ec1dbab724b599d6823798e5240641d21"} Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.528364 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" event={"ID":"6402fac4-067f-4410-a00c-0d438d502f3c","Type":"ContainerStarted","Data":"7bf062964211d9588f8fa96259dbd7444ed4473032827ce9ddf10e9679924eb8"} Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.530395 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:c6405d94e56b40ef669729216ab4b9c441f34bb280902efa2940038c076b560f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" podUID="a933a688-5393-4b7b-b0b7-6ee5791970b1" Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.530531 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" event={"ID":"1ef630cb-2220-41f5-8a3d-66a2a78ce0ce","Type":"ContainerStarted","Data":"0f1ac2df9b51be6a0e5ab45d7baace93df879ec33fa426fddda03a546b0ca816"} Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.531758 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" event={"ID":"d0ef0237-045a-4153-a377-07b2c9e6ceba","Type":"ContainerStarted","Data":"6e630e0bcbf604d79e17acdfd340713b8a0da35dab217380a9b8994d0094a1a5"} Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.536638 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" event={"ID":"333ae034-2972-4915-a547-364c01510827","Type":"ContainerStarted","Data":"f613867909153d6159ad200d07314d82835c7c7a2e465b36c6fd4cfe0decf325"} Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.538351 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:86df58f744c1d23233cc98f6ea17c8d6da637c50003d0fc8c100045594aa9894\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" podUID="333ae034-2972-4915-a547-364c01510827" Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.541157 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" event={"ID":"052c7786-4d54-4af0-8598-91ff09cdf966","Type":"ContainerStarted","Data":"51937fe197f336fb0f01b6eb1f142c519bfbca7dbdb78283c25f990e0f83b85e"} Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.544181 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" event={"ID":"f35f4446-328e-40d3-96d6-2bc814fb8a96","Type":"ContainerStarted","Data":"3f462fed0a3dbec35cf908603eddf245522af40051be470aa813d62ac2152d78"} Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.551454 4565 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" event={"ID":"cf68120a-e894-4189-8035-91f8045618c0","Type":"ContainerStarted","Data":"5abde8ec995dd0e98caf48fee92dfbe88735449e726b2e64c902bd1643751ec1"} Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.551620 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c0b5f124a37c1538042c0e63f0978429572e2a851d7f3a6eb80de09b86d755a0\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" podUID="f35f4446-328e-40d3-96d6-2bc814fb8a96" Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.552826 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" event={"ID":"6279e5b8-cc23-4b43-9554-754a61174bcd","Type":"ContainerStarted","Data":"e02c156a8935362f72bc4c3043110dffb9339f770acbaabfde6373d3d5e70244"} Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.566074 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" event={"ID":"f2c67417-c283-4158-91ec-f49478a5378e","Type":"ContainerStarted","Data":"ecddfbd9d417d62ebb52dda4f37f9cf388d9f73ef10c3a6965eb20e1bedd8933"} Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.576405 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" event={"ID":"4ee66804-213d-4e52-b04b-6b00eec8de2d","Type":"ContainerStarted","Data":"19fec2712fc55fd53903932028f3fd52c9ea22ddc9951c62a123861cbbb17ef8"} Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 
09:16:39.581160 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:7b90521b9e9cb4eb43c2f1c3bf85dbd068d684315f4f705b07708dd078df9d04\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" podUID="4ee66804-213d-4e52-b04b-6b00eec8de2d" Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.584135 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" event={"ID":"3791b99a-d877-470f-8a8f-56f7b02be997","Type":"ContainerStarted","Data":"1ad4957d958615c6368999704828c7f172b4ba664084f6e2c7e495e531a67880"} Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.587764 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" podUID="3791b99a-d877-470f-8a8f-56f7b02be997" Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.587881 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cb74df96-sj4j7" event={"ID":"cbdce822-eeeb-448b-9f3b-46fdf9e9b43d","Type":"ContainerStarted","Data":"d6239a6e3bad44a14c256fc49700815693b21f98cbeeb8d6a709f43d93062658"} Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.589445 4565 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" event={"ID":"d5be161b-0f0c-485e-b1c7-50a9fff4b053","Type":"ContainerStarted","Data":"79e25e6c2f040316b6c7247326c1de42194d4b440ba73dcef3f79f4333208868"} Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.591626 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:3ef72bbd7cce89ff54d850ff44ca6d7b2360834a502da3d561aeb6fd3d9af50a\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" podUID="d5be161b-0f0c-485e-b1c7-50a9fff4b053" Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.591803 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" event={"ID":"92be75e0-b60b-4f41-bde1-4f74a4d306e3","Type":"ContainerStarted","Data":"950bc4da70334d9ec93a3029b6f3825a3b9685d44f5e591a00392e8bb8a39f35"} Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.597363 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" event={"ID":"873884b1-6ee8-400c-9ca2-0b0b3c4618e9","Type":"ContainerStarted","Data":"ee227e9fa9fc8943c50e79facd3f5d71f71b5eeb263a4a4400b1b1336cfc63f7"} Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.602241 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" event={"ID":"1af57713-55c3-45ec-b98b-1aac75a2d60b","Type":"ContainerStarted","Data":"37a4b75135c3ab89e0ecc8bc9f0e9e8fdf1c03a1026eec0d30a02f0e85a81225"} Nov 25 09:16:39 crc 
kubenswrapper[4565]: I1125 09:16:39.608854 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" event={"ID":"31dbf471-6fab-4ddd-a384-e4dd5335d5dc","Type":"ContainerStarted","Data":"e96c73574febf0bbc082d9709c1b7282af29cd7132675fe6eace14034cddf7d4"} Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.616524 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:4094e7fc11a33e8e2b6768a053cafaf5b122446d23f9113d43d520cb64e9776c\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" podUID="31dbf471-6fab-4ddd-a384-e4dd5335d5dc" Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.929054 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/579400cf-d71f-47f4-a98e-b94ccbf4ff72-webhook-certs\") pod \"openstack-operator-controller-manager-7cd5954d9-fkc7l\" (UID: \"579400cf-d71f-47f4-a98e-b94ccbf4ff72\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.929390 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/579400cf-d71f-47f4-a98e-b94ccbf4ff72-metrics-certs\") pod \"openstack-operator-controller-manager-7cd5954d9-fkc7l\" (UID: \"579400cf-d71f-47f4-a98e-b94ccbf4ff72\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.930351 4565 secret.go:188] Couldn't get secret 
openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 25 09:16:39 crc kubenswrapper[4565]: E1125 09:16:39.930421 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/579400cf-d71f-47f4-a98e-b94ccbf4ff72-webhook-certs podName:579400cf-d71f-47f4-a98e-b94ccbf4ff72 nodeName:}" failed. No retries permitted until 2025-11-25 09:16:41.930404259 +0000 UTC m=+735.132899397 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/579400cf-d71f-47f4-a98e-b94ccbf4ff72-webhook-certs") pod "openstack-operator-controller-manager-7cd5954d9-fkc7l" (UID: "579400cf-d71f-47f4-a98e-b94ccbf4ff72") : secret "webhook-server-cert" not found Nov 25 09:16:39 crc kubenswrapper[4565]: I1125 09:16:39.937024 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/579400cf-d71f-47f4-a98e-b94ccbf4ff72-metrics-certs\") pod \"openstack-operator-controller-manager-7cd5954d9-fkc7l\" (UID: \"579400cf-d71f-47f4-a98e-b94ccbf4ff72\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" Nov 25 09:16:40 crc kubenswrapper[4565]: E1125 09:16:40.625554 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:86df58f744c1d23233cc98f6ea17c8d6da637c50003d0fc8c100045594aa9894\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" podUID="333ae034-2972-4915-a547-364c01510827" Nov 25 09:16:40 crc kubenswrapper[4565]: E1125 09:16:40.625886 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4llp" podUID="a65931e1-7a1f-4251-9c4f-996b407dfb03" Nov 25 09:16:40 crc kubenswrapper[4565]: E1125 09:16:40.626310 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:7b90521b9e9cb4eb43c2f1c3bf85dbd068d684315f4f705b07708dd078df9d04\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" podUID="4ee66804-213d-4e52-b04b-6b00eec8de2d" Nov 25 09:16:40 crc kubenswrapper[4565]: E1125 09:16:40.626354 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:3ef72bbd7cce89ff54d850ff44ca6d7b2360834a502da3d561aeb6fd3d9af50a\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" podUID="d5be161b-0f0c-485e-b1c7-50a9fff4b053" Nov 25 09:16:40 crc kubenswrapper[4565]: E1125 09:16:40.626981 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" 
with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" podUID="3791b99a-d877-470f-8a8f-56f7b02be997" Nov 25 09:16:40 crc kubenswrapper[4565]: E1125 09:16:40.627034 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:4094e7fc11a33e8e2b6768a053cafaf5b122446d23f9113d43d520cb64e9776c\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" podUID="31dbf471-6fab-4ddd-a384-e4dd5335d5dc" Nov 25 09:16:40 crc kubenswrapper[4565]: E1125 09:16:40.629307 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c0b5f124a37c1538042c0e63f0978429572e2a851d7f3a6eb80de09b86d755a0\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" podUID="f35f4446-328e-40d3-96d6-2bc814fb8a96" Nov 25 09:16:40 crc kubenswrapper[4565]: E1125 09:16:40.642623 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:c6405d94e56b40ef669729216ab4b9c441f34bb280902efa2940038c076b560f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" podUID="a933a688-5393-4b7b-b0b7-6ee5791970b1" Nov 25 09:16:41 crc kubenswrapper[4565]: I1125 09:16:41.594826 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4a03edc-1b0f-4f50-bab7-b2292c453f4d-cert\") pod \"openstack-baremetal-operator-controller-manager-b58f89467-sw4l6\" (UID: \"d4a03edc-1b0f-4f50-bab7-b2292c453f4d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" Nov 25 09:16:41 crc kubenswrapper[4565]: I1125 09:16:41.615439 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4a03edc-1b0f-4f50-bab7-b2292c453f4d-cert\") pod \"openstack-baremetal-operator-controller-manager-b58f89467-sw4l6\" (UID: \"d4a03edc-1b0f-4f50-bab7-b2292c453f4d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" Nov 25 09:16:41 crc kubenswrapper[4565]: I1125 09:16:41.782004 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" Nov 25 09:16:42 crc kubenswrapper[4565]: I1125 09:16:42.000111 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/579400cf-d71f-47f4-a98e-b94ccbf4ff72-webhook-certs\") pod \"openstack-operator-controller-manager-7cd5954d9-fkc7l\" (UID: \"579400cf-d71f-47f4-a98e-b94ccbf4ff72\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" Nov 25 09:16:42 crc kubenswrapper[4565]: I1125 09:16:42.016856 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/579400cf-d71f-47f4-a98e-b94ccbf4ff72-webhook-certs\") pod \"openstack-operator-controller-manager-7cd5954d9-fkc7l\" (UID: \"579400cf-d71f-47f4-a98e-b94ccbf4ff72\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" Nov 25 09:16:42 crc kubenswrapper[4565]: I1125 09:16:42.097644 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" Nov 25 09:16:50 crc kubenswrapper[4565]: I1125 09:16:50.008445 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7lhlj"] Nov 25 09:16:50 crc kubenswrapper[4565]: I1125 09:16:50.009514 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj" podUID="684d871e-b542-4963-be08-7dba0c7b6d6a" containerName="controller-manager" containerID="cri-o://5f5b13af332582ca68525477848651f7ff8b47fe7930eba79537694c51ff1c4d" gracePeriod=30 Nov 25 09:16:50 crc kubenswrapper[4565]: I1125 09:16:50.120493 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr"] Nov 25 09:16:50 crc kubenswrapper[4565]: I1125 09:16:50.120646 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr" podUID="d9d37581-10f7-4e98-81e1-6a17ef4a527a" containerName="route-controller-manager" containerID="cri-o://9dcf43ac0130e9930e6f54253538d094ae838abd781150edfe03d64d2df1550d" gracePeriod=30 Nov 25 09:16:50 crc kubenswrapper[4565]: I1125 09:16:50.687781 4565 generic.go:334] "Generic (PLEG): container finished" podID="684d871e-b542-4963-be08-7dba0c7b6d6a" containerID="5f5b13af332582ca68525477848651f7ff8b47fe7930eba79537694c51ff1c4d" exitCode=0 Nov 25 09:16:50 crc kubenswrapper[4565]: I1125 09:16:50.687870 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj" event={"ID":"684d871e-b542-4963-be08-7dba0c7b6d6a","Type":"ContainerDied","Data":"5f5b13af332582ca68525477848651f7ff8b47fe7930eba79537694c51ff1c4d"} Nov 25 09:16:50 crc kubenswrapper[4565]: I1125 09:16:50.689230 4565 generic.go:334] "Generic (PLEG): 
container finished" podID="d9d37581-10f7-4e98-81e1-6a17ef4a527a" containerID="9dcf43ac0130e9930e6f54253538d094ae838abd781150edfe03d64d2df1550d" exitCode=0 Nov 25 09:16:50 crc kubenswrapper[4565]: I1125 09:16:50.689266 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr" event={"ID":"d9d37581-10f7-4e98-81e1-6a17ef4a527a","Type":"ContainerDied","Data":"9dcf43ac0130e9930e6f54253538d094ae838abd781150edfe03d64d2df1550d"} Nov 25 09:16:51 crc kubenswrapper[4565]: E1125 09:16:51.571014 4565 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:207578cb433471cc1a79c21a808c8a15489d1d3c9fa77e29f3f697c33917fec6" Nov 25 09:16:51 crc kubenswrapper[4565]: E1125 09:16:51.571248 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:207578cb433471cc1a79c21a808c8a15489d1d3c9fa77e29f3f697c33917fec6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2xx8d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-7c57c8bbc4-pzd74_openstack-operators(d0ef0237-045a-4153-a377-07b2c9e6ceba): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 09:16:52 crc kubenswrapper[4565]: E1125 09:16:52.061576 4565 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d" Nov 25 09:16:52 crc kubenswrapper[4565]: E1125 09:16:52.062219 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hqf5n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cb74df96-sj4j7_openstack-operators(cbdce822-eeeb-448b-9f3b-46fdf9e9b43d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 09:16:52 crc kubenswrapper[4565]: E1125 09:16:52.499750 4565 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:5324a6d2f76fc3041023b0cbd09a733ef2b59f310d390e4d6483d219eb96494f" Nov 25 09:16:52 crc kubenswrapper[4565]: E1125 09:16:52.500021 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:5324a6d2f76fc3041023b0cbd09a733ef2b59f310d390e4d6483d219eb96494f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qfnt9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-567f98c9d-7dzx4_openstack-operators(1ef630cb-2220-41f5-8a3d-66a2a78ce0ce): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.598129 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.655164 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn"] Nov 25 09:16:52 crc kubenswrapper[4565]: E1125 09:16:52.655479 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="684d871e-b542-4963-be08-7dba0c7b6d6a" containerName="controller-manager" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.655492 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="684d871e-b542-4963-be08-7dba0c7b6d6a" containerName="controller-manager" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.655612 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="684d871e-b542-4963-be08-7dba0c7b6d6a" containerName="controller-manager" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.656082 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.668881 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn"] Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.745526 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj" event={"ID":"684d871e-b542-4963-be08-7dba0c7b6d6a","Type":"ContainerDied","Data":"80454e6aaec800da3bcfe866e71a6bee9fd2a9b6e097c928b6d4f9f5793d28ae"} Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.745746 4565 scope.go:117] "RemoveContainer" containerID="5f5b13af332582ca68525477848651f7ff8b47fe7930eba79537694c51ff1c4d" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.745853 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7lhlj" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.773571 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/684d871e-b542-4963-be08-7dba0c7b6d6a-client-ca\") pod \"684d871e-b542-4963-be08-7dba0c7b6d6a\" (UID: \"684d871e-b542-4963-be08-7dba0c7b6d6a\") " Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.773632 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp8mn\" (UniqueName: \"kubernetes.io/projected/684d871e-b542-4963-be08-7dba0c7b6d6a-kube-api-access-lp8mn\") pod \"684d871e-b542-4963-be08-7dba0c7b6d6a\" (UID: \"684d871e-b542-4963-be08-7dba0c7b6d6a\") " Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.773663 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/684d871e-b542-4963-be08-7dba0c7b6d6a-serving-cert\") pod \"684d871e-b542-4963-be08-7dba0c7b6d6a\" (UID: \"684d871e-b542-4963-be08-7dba0c7b6d6a\") " Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.773724 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/684d871e-b542-4963-be08-7dba0c7b6d6a-config\") pod \"684d871e-b542-4963-be08-7dba0c7b6d6a\" (UID: \"684d871e-b542-4963-be08-7dba0c7b6d6a\") " Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.773785 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/684d871e-b542-4963-be08-7dba0c7b6d6a-proxy-ca-bundles\") pod \"684d871e-b542-4963-be08-7dba0c7b6d6a\" (UID: \"684d871e-b542-4963-be08-7dba0c7b6d6a\") " Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.774288 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5d36539-833a-44b7-860b-f4f3db9ad651-proxy-ca-bundles\") pod \"controller-manager-6fbc8bb446-9j6sn\" (UID: \"b5d36539-833a-44b7-860b-f4f3db9ad651\") " pod="openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.774346 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5d36539-833a-44b7-860b-f4f3db9ad651-serving-cert\") pod \"controller-manager-6fbc8bb446-9j6sn\" (UID: \"b5d36539-833a-44b7-860b-f4f3db9ad651\") " pod="openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.774365 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r2hs\" (UniqueName: \"kubernetes.io/projected/b5d36539-833a-44b7-860b-f4f3db9ad651-kube-api-access-7r2hs\") pod \"controller-manager-6fbc8bb446-9j6sn\" (UID: \"b5d36539-833a-44b7-860b-f4f3db9ad651\") " pod="openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.774420 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5d36539-833a-44b7-860b-f4f3db9ad651-client-ca\") pod \"controller-manager-6fbc8bb446-9j6sn\" (UID: \"b5d36539-833a-44b7-860b-f4f3db9ad651\") " pod="openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.774457 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5d36539-833a-44b7-860b-f4f3db9ad651-config\") pod \"controller-manager-6fbc8bb446-9j6sn\" (UID: \"b5d36539-833a-44b7-860b-f4f3db9ad651\") " 
pod="openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.774771 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/684d871e-b542-4963-be08-7dba0c7b6d6a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "684d871e-b542-4963-be08-7dba0c7b6d6a" (UID: "684d871e-b542-4963-be08-7dba0c7b6d6a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.774895 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/684d871e-b542-4963-be08-7dba0c7b6d6a-client-ca" (OuterVolumeSpecName: "client-ca") pod "684d871e-b542-4963-be08-7dba0c7b6d6a" (UID: "684d871e-b542-4963-be08-7dba0c7b6d6a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.777854 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/684d871e-b542-4963-be08-7dba0c7b6d6a-config" (OuterVolumeSpecName: "config") pod "684d871e-b542-4963-be08-7dba0c7b6d6a" (UID: "684d871e-b542-4963-be08-7dba0c7b6d6a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.781766 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/684d871e-b542-4963-be08-7dba0c7b6d6a-kube-api-access-lp8mn" (OuterVolumeSpecName: "kube-api-access-lp8mn") pod "684d871e-b542-4963-be08-7dba0c7b6d6a" (UID: "684d871e-b542-4963-be08-7dba0c7b6d6a"). InnerVolumeSpecName "kube-api-access-lp8mn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.785512 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/684d871e-b542-4963-be08-7dba0c7b6d6a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "684d871e-b542-4963-be08-7dba0c7b6d6a" (UID: "684d871e-b542-4963-be08-7dba0c7b6d6a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.860534 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.876837 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5d36539-833a-44b7-860b-f4f3db9ad651-proxy-ca-bundles\") pod \"controller-manager-6fbc8bb446-9j6sn\" (UID: \"b5d36539-833a-44b7-860b-f4f3db9ad651\") " pod="openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.879842 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5d36539-833a-44b7-860b-f4f3db9ad651-serving-cert\") pod \"controller-manager-6fbc8bb446-9j6sn\" (UID: \"b5d36539-833a-44b7-860b-f4f3db9ad651\") " pod="openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.879906 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r2hs\" (UniqueName: \"kubernetes.io/projected/b5d36539-833a-44b7-860b-f4f3db9ad651-kube-api-access-7r2hs\") pod \"controller-manager-6fbc8bb446-9j6sn\" (UID: \"b5d36539-833a-44b7-860b-f4f3db9ad651\") " pod="openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn" Nov 25 
09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.880036 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5d36539-833a-44b7-860b-f4f3db9ad651-client-ca\") pod \"controller-manager-6fbc8bb446-9j6sn\" (UID: \"b5d36539-833a-44b7-860b-f4f3db9ad651\") " pod="openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.880162 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5d36539-833a-44b7-860b-f4f3db9ad651-config\") pod \"controller-manager-6fbc8bb446-9j6sn\" (UID: \"b5d36539-833a-44b7-860b-f4f3db9ad651\") " pod="openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.880249 4565 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/684d871e-b542-4963-be08-7dba0c7b6d6a-client-ca\") on node \"crc\" DevicePath \"\"" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.880264 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp8mn\" (UniqueName: \"kubernetes.io/projected/684d871e-b542-4963-be08-7dba0c7b6d6a-kube-api-access-lp8mn\") on node \"crc\" DevicePath \"\"" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.880276 4565 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/684d871e-b542-4963-be08-7dba0c7b6d6a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.880287 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/684d871e-b542-4963-be08-7dba0c7b6d6a-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.880296 4565 reconciler_common.go:293] "Volume detached for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/684d871e-b542-4963-be08-7dba0c7b6d6a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.882172 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5d36539-833a-44b7-860b-f4f3db9ad651-proxy-ca-bundles\") pod \"controller-manager-6fbc8bb446-9j6sn\" (UID: \"b5d36539-833a-44b7-860b-f4f3db9ad651\") " pod="openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.882216 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5d36539-833a-44b7-860b-f4f3db9ad651-config\") pod \"controller-manager-6fbc8bb446-9j6sn\" (UID: \"b5d36539-833a-44b7-860b-f4f3db9ad651\") " pod="openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.888998 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5d36539-833a-44b7-860b-f4f3db9ad651-serving-cert\") pod \"controller-manager-6fbc8bb446-9j6sn\" (UID: \"b5d36539-833a-44b7-860b-f4f3db9ad651\") " pod="openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.889938 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5d36539-833a-44b7-860b-f4f3db9ad651-client-ca\") pod \"controller-manager-6fbc8bb446-9j6sn\" (UID: \"b5d36539-833a-44b7-860b-f4f3db9ad651\") " pod="openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.913296 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r2hs\" (UniqueName: 
\"kubernetes.io/projected/b5d36539-833a-44b7-860b-f4f3db9ad651-kube-api-access-7r2hs\") pod \"controller-manager-6fbc8bb446-9j6sn\" (UID: \"b5d36539-833a-44b7-860b-f4f3db9ad651\") " pod="openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.981741 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9d37581-10f7-4e98-81e1-6a17ef4a527a-config\") pod \"d9d37581-10f7-4e98-81e1-6a17ef4a527a\" (UID: \"d9d37581-10f7-4e98-81e1-6a17ef4a527a\") " Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.981783 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7v8n\" (UniqueName: \"kubernetes.io/projected/d9d37581-10f7-4e98-81e1-6a17ef4a527a-kube-api-access-w7v8n\") pod \"d9d37581-10f7-4e98-81e1-6a17ef4a527a\" (UID: \"d9d37581-10f7-4e98-81e1-6a17ef4a527a\") " Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.981865 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9d37581-10f7-4e98-81e1-6a17ef4a527a-client-ca\") pod \"d9d37581-10f7-4e98-81e1-6a17ef4a527a\" (UID: \"d9d37581-10f7-4e98-81e1-6a17ef4a527a\") " Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.981907 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9d37581-10f7-4e98-81e1-6a17ef4a527a-serving-cert\") pod \"d9d37581-10f7-4e98-81e1-6a17ef4a527a\" (UID: \"d9d37581-10f7-4e98-81e1-6a17ef4a527a\") " Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.983809 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9d37581-10f7-4e98-81e1-6a17ef4a527a-client-ca" (OuterVolumeSpecName: "client-ca") pod "d9d37581-10f7-4e98-81e1-6a17ef4a527a" (UID: "d9d37581-10f7-4e98-81e1-6a17ef4a527a"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.983883 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9d37581-10f7-4e98-81e1-6a17ef4a527a-config" (OuterVolumeSpecName: "config") pod "d9d37581-10f7-4e98-81e1-6a17ef4a527a" (UID: "d9d37581-10f7-4e98-81e1-6a17ef4a527a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:16:52 crc kubenswrapper[4565]: I1125 09:16:52.989869 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9d37581-10f7-4e98-81e1-6a17ef4a527a-kube-api-access-w7v8n" (OuterVolumeSpecName: "kube-api-access-w7v8n") pod "d9d37581-10f7-4e98-81e1-6a17ef4a527a" (UID: "d9d37581-10f7-4e98-81e1-6a17ef4a527a"). InnerVolumeSpecName "kube-api-access-w7v8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.019255 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9d37581-10f7-4e98-81e1-6a17ef4a527a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d9d37581-10f7-4e98-81e1-6a17ef4a527a" (UID: "d9d37581-10f7-4e98-81e1-6a17ef4a527a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.085278 4565 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9d37581-10f7-4e98-81e1-6a17ef4a527a-client-ca\") on node \"crc\" DevicePath \"\"" Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.085572 4565 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9d37581-10f7-4e98-81e1-6a17ef4a527a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.085588 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9d37581-10f7-4e98-81e1-6a17ef4a527a-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.085598 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7v8n\" (UniqueName: \"kubernetes.io/projected/d9d37581-10f7-4e98-81e1-6a17ef4a527a-kube-api-access-w7v8n\") on node \"crc\" DevicePath \"\"" Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.130916 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6"] Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.131021 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7lhlj"] Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.131035 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l"] Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.133475 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7lhlj"] Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.136441 4565 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn" Nov 25 09:16:53 crc kubenswrapper[4565]: W1125 09:16:53.485102 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4a03edc_1b0f_4f50_bab7_b2292c453f4d.slice/crio-dbbf55d8dc0c505f83a3eb812ad8c187b123180bb16fe42d61f8dc0543470ca4 WatchSource:0}: Error finding container dbbf55d8dc0c505f83a3eb812ad8c187b123180bb16fe42d61f8dc0543470ca4: Status 404 returned error can't find the container with id dbbf55d8dc0c505f83a3eb812ad8c187b123180bb16fe42d61f8dc0543470ca4 Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.780315 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" event={"ID":"cf68120a-e894-4189-8035-91f8045618c0","Type":"ContainerStarted","Data":"5304514dba13319a2ba18742f4624c0238b46fe0a13ee331aca3412ae4390ac4"} Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.790000 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" event={"ID":"1af57713-55c3-45ec-b98b-1aac75a2d60b","Type":"ContainerStarted","Data":"eddbde34bce7f860b148d6cf4d6799a9409966d496b8aef9457d602b38d0841b"} Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.791905 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" event={"ID":"6279e5b8-cc23-4b43-9554-754a61174bcd","Type":"ContainerStarted","Data":"4e9d4c6939b486ac03437f55781802f96efce947532f9a9dce882321a0f36c48"} Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.808505 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" 
event={"ID":"f2c67417-c283-4158-91ec-f49478a5378e","Type":"ContainerStarted","Data":"263b89793cc5d3a3a3cb480f19bc73a24f38dc269b9fcb0f53c578e6fa2c4509"} Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.839527 4565 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-2v7lr container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.839808 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr" podUID="d9d37581-10f7-4e98-81e1-6a17ef4a527a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.848654 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr" event={"ID":"d9d37581-10f7-4e98-81e1-6a17ef4a527a","Type":"ContainerDied","Data":"cbf6c16f72625e2153c1cd68de024d9dfa0125d12713d23e9dfb67e542bff93b"} Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.848814 4565 scope.go:117] "RemoveContainer" containerID="9dcf43ac0130e9930e6f54253538d094ae838abd781150edfe03d64d2df1550d" Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.849050 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr" Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.870376 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" event={"ID":"93da1f7e-c5e8-4c9c-b6af-feb85c526b47","Type":"ContainerStarted","Data":"3299c62482ca3385dd947f2c0b11108d7248f063e8aba8c0a6c08fd8f2e2a9fd"} Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.876027 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" event={"ID":"052c7786-4d54-4af0-8598-91ff09cdf966","Type":"ContainerStarted","Data":"166feafff94ed9998cc225abbaa23b51797abc515f2146e5879179a9ba8ad307"} Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.886208 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" event={"ID":"92be75e0-b60b-4f41-bde1-4f74a4d306e3","Type":"ContainerStarted","Data":"1bbe5a65350874311ae08c162c53a476b40076f44e2ffa9eb7261763920bee54"} Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.897096 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr"] Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.902589 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2v7lr"] Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.918122 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" event={"ID":"873884b1-6ee8-400c-9ca2-0b0b3c4618e9","Type":"ContainerStarted","Data":"5b628e980940a13fbfb319b158c9a6d712f3202fc04bb3a1f2f22cc6b3b1c08d"} Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.926029 4565 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" event={"ID":"6402fac4-067f-4410-a00c-0d438d502f3c","Type":"ContainerStarted","Data":"39416707e848ac167d22107d737be21ebd583d251bb7d1c35a8224268cf783a3"} Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.927107 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" event={"ID":"d4a03edc-1b0f-4f50-bab7-b2292c453f4d","Type":"ContainerStarted","Data":"dbbf55d8dc0c505f83a3eb812ad8c187b123180bb16fe42d61f8dc0543470ca4"} Nov 25 09:16:53 crc kubenswrapper[4565]: I1125 09:16:53.929521 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" event={"ID":"354fe5db-35d0-4d94-989c-02a077f8bd20","Type":"ContainerStarted","Data":"f2c4442747e0bb833c0acac848a2bc57dddd3b678a164abcae51fb4ac20c914b"} Nov 25 09:16:54 crc kubenswrapper[4565]: I1125 09:16:54.792766 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7689885d55-tbx58"] Nov 25 09:16:54 crc kubenswrapper[4565]: E1125 09:16:54.793074 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9d37581-10f7-4e98-81e1-6a17ef4a527a" containerName="route-controller-manager" Nov 25 09:16:54 crc kubenswrapper[4565]: I1125 09:16:54.793086 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d37581-10f7-4e98-81e1-6a17ef4a527a" containerName="route-controller-manager" Nov 25 09:16:54 crc kubenswrapper[4565]: I1125 09:16:54.793201 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9d37581-10f7-4e98-81e1-6a17ef4a527a" containerName="route-controller-manager" Nov 25 09:16:54 crc kubenswrapper[4565]: I1125 09:16:54.793607 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7689885d55-tbx58" Nov 25 09:16:54 crc kubenswrapper[4565]: I1125 09:16:54.796732 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 25 09:16:54 crc kubenswrapper[4565]: I1125 09:16:54.797346 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 25 09:16:54 crc kubenswrapper[4565]: I1125 09:16:54.797402 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 25 09:16:54 crc kubenswrapper[4565]: I1125 09:16:54.797558 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 25 09:16:54 crc kubenswrapper[4565]: I1125 09:16:54.797364 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 25 09:16:54 crc kubenswrapper[4565]: I1125 09:16:54.800738 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 25 09:16:54 crc kubenswrapper[4565]: I1125 09:16:54.804374 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7689885d55-tbx58"] Nov 25 09:16:54 crc kubenswrapper[4565]: I1125 09:16:54.820471 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7b35439-2134-427e-89a9-49cb70e3c445-client-ca\") pod \"route-controller-manager-7689885d55-tbx58\" (UID: \"d7b35439-2134-427e-89a9-49cb70e3c445\") " pod="openshift-route-controller-manager/route-controller-manager-7689885d55-tbx58" Nov 25 09:16:54 crc kubenswrapper[4565]: I1125 09:16:54.820546 4565 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7b35439-2134-427e-89a9-49cb70e3c445-config\") pod \"route-controller-manager-7689885d55-tbx58\" (UID: \"d7b35439-2134-427e-89a9-49cb70e3c445\") " pod="openshift-route-controller-manager/route-controller-manager-7689885d55-tbx58" Nov 25 09:16:54 crc kubenswrapper[4565]: I1125 09:16:54.820584 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7b35439-2134-427e-89a9-49cb70e3c445-serving-cert\") pod \"route-controller-manager-7689885d55-tbx58\" (UID: \"d7b35439-2134-427e-89a9-49cb70e3c445\") " pod="openshift-route-controller-manager/route-controller-manager-7689885d55-tbx58" Nov 25 09:16:54 crc kubenswrapper[4565]: I1125 09:16:54.820604 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nhzl\" (UniqueName: \"kubernetes.io/projected/d7b35439-2134-427e-89a9-49cb70e3c445-kube-api-access-6nhzl\") pod \"route-controller-manager-7689885d55-tbx58\" (UID: \"d7b35439-2134-427e-89a9-49cb70e3c445\") " pod="openshift-route-controller-manager/route-controller-manager-7689885d55-tbx58" Nov 25 09:16:54 crc kubenswrapper[4565]: I1125 09:16:54.922299 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7b35439-2134-427e-89a9-49cb70e3c445-config\") pod \"route-controller-manager-7689885d55-tbx58\" (UID: \"d7b35439-2134-427e-89a9-49cb70e3c445\") " pod="openshift-route-controller-manager/route-controller-manager-7689885d55-tbx58" Nov 25 09:16:54 crc kubenswrapper[4565]: I1125 09:16:54.922411 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7b35439-2134-427e-89a9-49cb70e3c445-serving-cert\") pod \"route-controller-manager-7689885d55-tbx58\" 
(UID: \"d7b35439-2134-427e-89a9-49cb70e3c445\") " pod="openshift-route-controller-manager/route-controller-manager-7689885d55-tbx58" Nov 25 09:16:54 crc kubenswrapper[4565]: I1125 09:16:54.922443 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nhzl\" (UniqueName: \"kubernetes.io/projected/d7b35439-2134-427e-89a9-49cb70e3c445-kube-api-access-6nhzl\") pod \"route-controller-manager-7689885d55-tbx58\" (UID: \"d7b35439-2134-427e-89a9-49cb70e3c445\") " pod="openshift-route-controller-manager/route-controller-manager-7689885d55-tbx58" Nov 25 09:16:54 crc kubenswrapper[4565]: I1125 09:16:54.922511 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7b35439-2134-427e-89a9-49cb70e3c445-client-ca\") pod \"route-controller-manager-7689885d55-tbx58\" (UID: \"d7b35439-2134-427e-89a9-49cb70e3c445\") " pod="openshift-route-controller-manager/route-controller-manager-7689885d55-tbx58" Nov 25 09:16:54 crc kubenswrapper[4565]: I1125 09:16:54.923492 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7b35439-2134-427e-89a9-49cb70e3c445-client-ca\") pod \"route-controller-manager-7689885d55-tbx58\" (UID: \"d7b35439-2134-427e-89a9-49cb70e3c445\") " pod="openshift-route-controller-manager/route-controller-manager-7689885d55-tbx58" Nov 25 09:16:54 crc kubenswrapper[4565]: I1125 09:16:54.923575 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7b35439-2134-427e-89a9-49cb70e3c445-config\") pod \"route-controller-manager-7689885d55-tbx58\" (UID: \"d7b35439-2134-427e-89a9-49cb70e3c445\") " pod="openshift-route-controller-manager/route-controller-manager-7689885d55-tbx58" Nov 25 09:16:54 crc kubenswrapper[4565]: I1125 09:16:54.929213 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7b35439-2134-427e-89a9-49cb70e3c445-serving-cert\") pod \"route-controller-manager-7689885d55-tbx58\" (UID: \"d7b35439-2134-427e-89a9-49cb70e3c445\") " pod="openshift-route-controller-manager/route-controller-manager-7689885d55-tbx58" Nov 25 09:16:54 crc kubenswrapper[4565]: I1125 09:16:54.949978 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" event={"ID":"579400cf-d71f-47f4-a98e-b94ccbf4ff72","Type":"ContainerStarted","Data":"b8a74aaab9074ea94b1ac05f7885f11b953d68342f041fde125dce76796235d6"} Nov 25 09:16:54 crc kubenswrapper[4565]: I1125 09:16:54.950231 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nhzl\" (UniqueName: \"kubernetes.io/projected/d7b35439-2134-427e-89a9-49cb70e3c445-kube-api-access-6nhzl\") pod \"route-controller-manager-7689885d55-tbx58\" (UID: \"d7b35439-2134-427e-89a9-49cb70e3c445\") " pod="openshift-route-controller-manager/route-controller-manager-7689885d55-tbx58" Nov 25 09:16:55 crc kubenswrapper[4565]: I1125 09:16:55.099601 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:16:55 crc kubenswrapper[4565]: I1125 09:16:55.099986 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:16:55 crc kubenswrapper[4565]: I1125 09:16:55.106608 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="684d871e-b542-4963-be08-7dba0c7b6d6a" path="/var/lib/kubelet/pods/684d871e-b542-4963-be08-7dba0c7b6d6a/volumes" Nov 25 09:16:55 crc kubenswrapper[4565]: I1125 09:16:55.107854 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9d37581-10f7-4e98-81e1-6a17ef4a527a" path="/var/lib/kubelet/pods/d9d37581-10f7-4e98-81e1-6a17ef4a527a/volumes" Nov 25 09:16:55 crc kubenswrapper[4565]: I1125 09:16:55.108389 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 09:16:55 crc kubenswrapper[4565]: I1125 09:16:55.108788 4565 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c10fd5b53bc647595e50d0c679601ea47018805ef1aa79f6ca728fb0a4552a71"} pod="openshift-machine-config-operator/machine-config-daemon-r28bt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 09:16:55 crc kubenswrapper[4565]: I1125 09:16:55.108844 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" containerID="cri-o://c10fd5b53bc647595e50d0c679601ea47018805ef1aa79f6ca728fb0a4552a71" gracePeriod=600 Nov 25 09:16:55 crc kubenswrapper[4565]: I1125 09:16:55.119258 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7689885d55-tbx58" Nov 25 09:16:55 crc kubenswrapper[4565]: I1125 09:16:55.957220 4565 generic.go:334] "Generic (PLEG): container finished" podID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerID="c10fd5b53bc647595e50d0c679601ea47018805ef1aa79f6ca728fb0a4552a71" exitCode=0 Nov 25 09:16:55 crc kubenswrapper[4565]: I1125 09:16:55.957267 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerDied","Data":"c10fd5b53bc647595e50d0c679601ea47018805ef1aa79f6ca728fb0a4552a71"} Nov 25 09:16:56 crc kubenswrapper[4565]: I1125 09:16:56.943992 4565 scope.go:117] "RemoveContainer" containerID="c83da26a41463f944ec153ed3943109e4f0b3cdfd67ffe37055c08c437f4c00f" Nov 25 09:16:57 crc kubenswrapper[4565]: I1125 09:16:57.381822 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn"] Nov 25 09:16:58 crc kubenswrapper[4565]: W1125 09:16:58.303028 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5d36539_833a_44b7_860b_f4f3db9ad651.slice/crio-ae212f4ab8ab82b9e759ec9db4179d91cb4e655bcf45ce9a1a4f03c62637b1d2 WatchSource:0}: Error finding container ae212f4ab8ab82b9e759ec9db4179d91cb4e655bcf45ce9a1a4f03c62637b1d2: Status 404 returned error can't find the container with id ae212f4ab8ab82b9e759ec9db4179d91cb4e655bcf45ce9a1a4f03c62637b1d2 Nov 25 09:16:58 crc kubenswrapper[4565]: I1125 09:16:58.985914 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn" event={"ID":"b5d36539-833a-44b7-860b-f4f3db9ad651","Type":"ContainerStarted","Data":"ae212f4ab8ab82b9e759ec9db4179d91cb4e655bcf45ce9a1a4f03c62637b1d2"} Nov 25 09:17:03 crc kubenswrapper[4565]: I1125 
09:17:03.198996 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7689885d55-tbx58"] Nov 25 09:17:04 crc kubenswrapper[4565]: I1125 09:17:04.037485 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" event={"ID":"579400cf-d71f-47f4-a98e-b94ccbf4ff72","Type":"ContainerStarted","Data":"48ad5addee24793315efdcdd8e9716c707249289752040834ebe787d71abeced"} Nov 25 09:17:04 crc kubenswrapper[4565]: I1125 09:17:04.039045 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" Nov 25 09:17:04 crc kubenswrapper[4565]: I1125 09:17:04.059392 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" podStartSLOduration=27.059369804 podStartE2EDuration="27.059369804s" podCreationTimestamp="2025-11-25 09:16:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:17:04.057657385 +0000 UTC m=+757.260152543" watchObservedRunningTime="2025-11-25 09:17:04.059369804 +0000 UTC m=+757.261864942" Nov 25 09:17:04 crc kubenswrapper[4565]: W1125 09:17:04.951801 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7b35439_2134_427e_89a9_49cb70e3c445.slice/crio-c2db83f27e074a9e2638a4902194ea208db21f611c56a36b0b9dcc4acba7c1e4 WatchSource:0}: Error finding container c2db83f27e074a9e2638a4902194ea208db21f611c56a36b0b9dcc4acba7c1e4: Status 404 returned error can't find the container with id c2db83f27e074a9e2638a4902194ea208db21f611c56a36b0b9dcc4acba7c1e4 Nov 25 09:17:04 crc kubenswrapper[4565]: E1125 09:17:04.961001 4565 log.go:32] "PullImage from image service failed" err="rpc error: code 
= Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Nov 25 09:17:04 crc kubenswrapper[4565]: E1125 09:17:04.961375 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2xx8d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-7c57c8bbc4-pzd74_openstack-operators(d0ef0237-045a-4153-a377-07b2c9e6ceba): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 09:17:04 crc kubenswrapper[4565]: E1125 09:17:04.963025 4565 pod_workers.go:1301] "Error syncing pod, 
skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" podUID="d0ef0237-045a-4153-a377-07b2c9e6ceba" Nov 25 09:17:05 crc kubenswrapper[4565]: I1125 09:17:05.045534 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7689885d55-tbx58" event={"ID":"d7b35439-2134-427e-89a9-49cb70e3c445","Type":"ContainerStarted","Data":"c2db83f27e074a9e2638a4902194ea208db21f611c56a36b0b9dcc4acba7c1e4"} Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.101078 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" event={"ID":"d5be161b-0f0c-485e-b1c7-50a9fff4b053","Type":"ContainerStarted","Data":"a12528fcdeb1de8e7dfa385a1d981fd380ed4e4bf30f3a9acc1ccde7d2b796b8"} Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.123038 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" event={"ID":"92be75e0-b60b-4f41-bde1-4f74a4d306e3","Type":"ContainerStarted","Data":"3d591f712a1798a2d153816e51ad1fe9ca49e0b190c7107956a1e69fbe5af018"} Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.124066 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.143234 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.150405 4565 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" event={"ID":"6279e5b8-cc23-4b43-9554-754a61174bcd","Type":"ContainerStarted","Data":"bfb51eb21b94d243e59672f347b0a405c61ea1b97a3ebffa2c457ee158f7bea8"} Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.151156 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.169436 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" event={"ID":"f2c67417-c283-4158-91ec-f49478a5378e","Type":"ContainerStarted","Data":"6deeb8f503d5b42bce726a713cf2e86f526da327d6766a30f4de440fc94cd8c3"} Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.172308 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.174772 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.184509 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" event={"ID":"f35f4446-328e-40d3-96d6-2bc814fb8a96","Type":"ContainerStarted","Data":"cafda87c4a914bf7e4e5b3b61c210729540c87f72d5956d1d530ff48cab41135"} Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.207170 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.215681 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7689885d55-tbx58" 
event={"ID":"d7b35439-2134-427e-89a9-49cb70e3c445","Type":"ContainerStarted","Data":"2ef636b546f119bc4055c196a640c291df8c36c07e6487b397d3d5e4fa8f571b"} Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.216473 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7689885d55-tbx58" Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.217475 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" podStartSLOduration=3.113364606 podStartE2EDuration="29.217463624s" podCreationTimestamp="2025-11-25 09:16:37 +0000 UTC" firstStartedPulling="2025-11-25 09:16:39.057475982 +0000 UTC m=+732.259971119" lastFinishedPulling="2025-11-25 09:17:05.161574999 +0000 UTC m=+758.364070137" observedRunningTime="2025-11-25 09:17:06.21535564 +0000 UTC m=+759.417850768" watchObservedRunningTime="2025-11-25 09:17:06.217463624 +0000 UTC m=+759.419958763" Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.252156 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4llp" event={"ID":"a65931e1-7a1f-4251-9c4f-996b407dfb03","Type":"ContainerStarted","Data":"4343e2d1f5e01268a651827a151d27955c9600d9fd68b7ccbdda7e4c2374e3ac"} Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.257022 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7689885d55-tbx58" Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.261053 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn" event={"ID":"b5d36539-833a-44b7-860b-f4f3db9ad651","Type":"ContainerStarted","Data":"ae8febf47f8c1bd5ed0cd466677f1f4e83fa8f547b525d47f6d14f76b55864e5"} Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.262172 4565 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn" Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.276227 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" event={"ID":"333ae034-2972-4915-a547-364c01510827","Type":"ContainerStarted","Data":"7f09210820aabf170f0ed34c6d282ddf5e8dc64da36a077852721535b935977b"} Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.292751 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" podStartSLOduration=3.257119642 podStartE2EDuration="29.292738958s" podCreationTimestamp="2025-11-25 09:16:37 +0000 UTC" firstStartedPulling="2025-11-25 09:16:39.278247914 +0000 UTC m=+732.480743052" lastFinishedPulling="2025-11-25 09:17:05.313867229 +0000 UTC m=+758.516362368" observedRunningTime="2025-11-25 09:17:06.275942843 +0000 UTC m=+759.478437981" watchObservedRunningTime="2025-11-25 09:17:06.292738958 +0000 UTC m=+759.495234096" Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.301559 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn" Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.360002 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" event={"ID":"31dbf471-6fab-4ddd-a384-e4dd5335d5dc","Type":"ContainerStarted","Data":"f4d9d06ae72225fe26a297e05d2013bf19d707ca1336dba5413d1eba46865385"} Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.394805 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" 
event={"ID":"a933a688-5393-4b7b-b0b7-6ee5791970b1","Type":"ContainerStarted","Data":"41a861697ee0014a152c9966148b09cc39165d3506ca0b9e482fccdfc89b459b"} Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.397034 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" event={"ID":"4ee66804-213d-4e52-b04b-6b00eec8de2d","Type":"ContainerStarted","Data":"9d9e03500c8cfd815f40511263131618cb2a25d961e13106b01b9f44e9a7722a"} Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.398075 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" event={"ID":"d4a03edc-1b0f-4f50-bab7-b2292c453f4d","Type":"ContainerStarted","Data":"230df4302bf427ac98820ee312d238563af7d8491eff5c4b798ff05a82aa6f3f"} Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.404103 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" event={"ID":"3791b99a-d877-470f-8a8f-56f7b02be997","Type":"ContainerStarted","Data":"1e376211332e007ac9c6b0e3d3f6d9972a045be74d6395f12a7b6c09deac8e26"} Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.404563 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.412151 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerStarted","Data":"f8cbf4d6873b3025c789286654bea15427f510e52a9c9dafb2d1c58270be257d"} Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.414728 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" Nov 25 09:17:06 crc 
kubenswrapper[4565]: I1125 09:17:06.418760 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.438025 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" event={"ID":"354fe5db-35d0-4d94-989c-02a077f8bd20","Type":"ContainerStarted","Data":"3f10f42bd0e83d79f98b62d27becaf13cbcb90a5fb0308d000f29203a1352281"} Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.438882 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.464148 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.464170 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" podStartSLOduration=3.6215488369999997 podStartE2EDuration="29.464155704s" podCreationTimestamp="2025-11-25 09:16:37 +0000 UTC" firstStartedPulling="2025-11-25 09:16:39.26606188 +0000 UTC m=+732.468557008" lastFinishedPulling="2025-11-25 09:17:05.108668736 +0000 UTC m=+758.311163875" observedRunningTime="2025-11-25 09:17:06.462313921 +0000 UTC m=+759.664809059" watchObservedRunningTime="2025-11-25 09:17:06.464155704 +0000 UTC m=+759.666650842" Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.543431 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" podStartSLOduration=3.559934357 podStartE2EDuration="29.543413134s" podCreationTimestamp="2025-11-25 09:16:37 +0000 UTC" firstStartedPulling="2025-11-25 
09:16:39.331810303 +0000 UTC m=+732.534305441" lastFinishedPulling="2025-11-25 09:17:05.315289089 +0000 UTC m=+758.517784218" observedRunningTime="2025-11-25 09:17:06.529724588 +0000 UTC m=+759.732219725" watchObservedRunningTime="2025-11-25 09:17:06.543413134 +0000 UTC m=+759.745908272" Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.647313 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4llp" podStartSLOduration=3.045217523 podStartE2EDuration="28.647289959s" podCreationTimestamp="2025-11-25 09:16:38 +0000 UTC" firstStartedPulling="2025-11-25 09:16:39.478500723 +0000 UTC m=+732.680995860" lastFinishedPulling="2025-11-25 09:17:05.080573159 +0000 UTC m=+758.283068296" observedRunningTime="2025-11-25 09:17:06.567139414 +0000 UTC m=+759.769634553" watchObservedRunningTime="2025-11-25 09:17:06.647289959 +0000 UTC m=+759.849785098" Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.650224 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7689885d55-tbx58" podStartSLOduration=16.650214904 podStartE2EDuration="16.650214904s" podCreationTimestamp="2025-11-25 09:16:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:17:06.644193672 +0000 UTC m=+759.846688799" watchObservedRunningTime="2025-11-25 09:17:06.650214904 +0000 UTC m=+759.852710042" Nov 25 09:17:06 crc kubenswrapper[4565]: E1125 09:17:06.658771 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" podUID="1ef630cb-2220-41f5-8a3d-66a2a78ce0ce" Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 
09:17:06.701975 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" podStartSLOduration=3.764742294 podStartE2EDuration="29.701948555s" podCreationTimestamp="2025-11-25 09:16:37 +0000 UTC" firstStartedPulling="2025-11-25 09:16:39.224342899 +0000 UTC m=+732.426838037" lastFinishedPulling="2025-11-25 09:17:05.161549159 +0000 UTC m=+758.364044298" observedRunningTime="2025-11-25 09:17:06.69535817 +0000 UTC m=+759.897853308" watchObservedRunningTime="2025-11-25 09:17:06.701948555 +0000 UTC m=+759.904443692" Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.748415 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6fbc8bb446-9j6sn" podStartSLOduration=16.74840077 podStartE2EDuration="16.74840077s" podCreationTimestamp="2025-11-25 09:16:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:17:06.746367095 +0000 UTC m=+759.948862233" watchObservedRunningTime="2025-11-25 09:17:06.74840077 +0000 UTC m=+759.950895907" Nov 25 09:17:06 crc kubenswrapper[4565]: I1125 09:17:06.781480 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" podStartSLOduration=10.411371997 podStartE2EDuration="29.781460926s" podCreationTimestamp="2025-11-25 09:16:37 +0000 UTC" firstStartedPulling="2025-11-25 09:16:39.471987564 +0000 UTC m=+732.674482702" lastFinishedPulling="2025-11-25 09:16:58.842076483 +0000 UTC m=+752.044571631" observedRunningTime="2025-11-25 09:17:06.7779036 +0000 UTC m=+759.980398727" watchObservedRunningTime="2025-11-25 09:17:06.781460926 +0000 UTC m=+759.983956064" Nov 25 09:17:07 crc kubenswrapper[4565]: E1125 09:17:07.269787 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5cb74df96-sj4j7" podUID="cbdce822-eeeb-448b-9f3b-46fdf9e9b43d" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.454885 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" event={"ID":"052c7786-4d54-4af0-8598-91ff09cdf966","Type":"ContainerStarted","Data":"0c0e775c21acb8ff984d75f54c9c644a73c77d1cead305843a5ba450efb2d2b7"} Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.456734 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" event={"ID":"873884b1-6ee8-400c-9ca2-0b0b3c4618e9","Type":"ContainerStarted","Data":"01d418c2b8cc6f3d5c74ff4d3e16dd92a48d27cb94b4ccbea292444f5104ce4f"} Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.457301 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.463467 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" event={"ID":"cf68120a-e894-4189-8035-91f8045618c0","Type":"ContainerStarted","Data":"cbf935873c0d9607a972fdd08fa3fb95af8b9288fd0e26734503a0a42488603c"} Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.463891 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.465149 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.467440 4565 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.476339 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" event={"ID":"a933a688-5393-4b7b-b0b7-6ee5791970b1","Type":"ContainerStarted","Data":"1557b4f1556c75909b4d66f04ac6192db42d1330fe3b127f517b63e90d669fc2"} Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.476758 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.477600 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" podStartSLOduration=4.108159275 podStartE2EDuration="30.477591725s" podCreationTimestamp="2025-11-25 09:16:37 +0000 UTC" firstStartedPulling="2025-11-25 09:16:39.067714994 +0000 UTC m=+732.270210132" lastFinishedPulling="2025-11-25 09:17:05.437147444 +0000 UTC m=+758.639642582" observedRunningTime="2025-11-25 09:17:07.476877419 +0000 UTC m=+760.679372556" watchObservedRunningTime="2025-11-25 09:17:07.477591725 +0000 UTC m=+760.680086853" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.484947 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" event={"ID":"1ef630cb-2220-41f5-8a3d-66a2a78ce0ce","Type":"ContainerStarted","Data":"455e94409ceb18677846eca5408b61c04d57eb81134a41142c49e4ccb956cb49"} Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.495263 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" podStartSLOduration=4.130857789 
podStartE2EDuration="30.495253743s" podCreationTimestamp="2025-11-25 09:16:37 +0000 UTC" firstStartedPulling="2025-11-25 09:16:39.013313043 +0000 UTC m=+732.215808181" lastFinishedPulling="2025-11-25 09:17:05.377708997 +0000 UTC m=+758.580204135" observedRunningTime="2025-11-25 09:17:07.49325305 +0000 UTC m=+760.695748188" watchObservedRunningTime="2025-11-25 09:17:07.495253743 +0000 UTC m=+760.697748880" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.500519 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" event={"ID":"d0ef0237-045a-4153-a377-07b2c9e6ceba","Type":"ContainerStarted","Data":"96ab7e0d1dab36db235dd8e5223a419423367ba478db8f4b4b224391ea1e08b4"} Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.508650 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" event={"ID":"1af57713-55c3-45ec-b98b-1aac75a2d60b","Type":"ContainerStarted","Data":"16b0975df4d4395cae160c633df4b3e5768fd555d2d86e17fc7f2dd8136d1472"} Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.510052 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.511912 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" event={"ID":"4ee66804-213d-4e52-b04b-6b00eec8de2d","Type":"ContainerStarted","Data":"d56ffd8234e474ab5e111067bf5d13face57653c714b5ab7aa7d9596493c56e9"} Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.512356 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.515992 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" event={"ID":"6402fac4-067f-4410-a00c-0d438d502f3c","Type":"ContainerStarted","Data":"e4494e69ac6e1a605f522ae37b83eb47cf7a3a89d63a86b0c54c7707717d5c50"} Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.517314 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.517832 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.523944 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.529191 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" event={"ID":"93da1f7e-c5e8-4c9c-b6af-feb85c526b47","Type":"ContainerStarted","Data":"895dad4d07f3d700f0a59d5b64a5a5bd17bce3cefe66f623225db32d7f8c6da9"} Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.529977 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.534555 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" podStartSLOduration=4.882138508 podStartE2EDuration="30.534544777s" podCreationTimestamp="2025-11-25 09:16:37 +0000 UTC" firstStartedPulling="2025-11-25 09:16:39.332008627 +0000 UTC m=+732.534503764" lastFinishedPulling="2025-11-25 09:17:04.984414895 +0000 UTC m=+758.186910033" observedRunningTime="2025-11-25 09:17:07.530323799 +0000 UTC m=+760.732818937" 
watchObservedRunningTime="2025-11-25 09:17:07.534544777 +0000 UTC m=+760.737039904" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.535013 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" event={"ID":"f35f4446-328e-40d3-96d6-2bc814fb8a96","Type":"ContainerStarted","Data":"150c7af156eed237cfaef719ff8564eda5e2dff4f5ae7cd85b06ff329ac587d4"} Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.535456 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.537035 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.537627 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" event={"ID":"d4a03edc-1b0f-4f50-bab7-b2292c453f4d","Type":"ContainerStarted","Data":"f6ea11e787ba4b1edda1ffb75236199cdd2b2f09f6a3b39850f248a50b433ee6"} Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.537763 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.558870 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" event={"ID":"3791b99a-d877-470f-8a8f-56f7b02be997","Type":"ContainerStarted","Data":"258cf11d6213f529a7ffa34eb8c66007294eae40a9111ac08d9595af9dcf27fc"} Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.590743 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" 
event={"ID":"31dbf471-6fab-4ddd-a384-e4dd5335d5dc","Type":"ContainerStarted","Data":"d1f145762134715eb452f591951d6f2572b1caa1f2951ce4fb1162ee744345e8"} Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.591256 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.599097 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" event={"ID":"333ae034-2972-4915-a547-364c01510827","Type":"ContainerStarted","Data":"7a290927bb92c13722072402dd170ece8537336a9b2554a5846a01a79d1c02eb"} Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.599548 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.613589 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" podStartSLOduration=4.310896648 podStartE2EDuration="30.613576832s" podCreationTimestamp="2025-11-25 09:16:37 +0000 UTC" firstStartedPulling="2025-11-25 09:16:39.072480239 +0000 UTC m=+732.274975377" lastFinishedPulling="2025-11-25 09:17:05.375160422 +0000 UTC m=+758.577655561" observedRunningTime="2025-11-25 09:17:07.609795123 +0000 UTC m=+760.812290261" watchObservedRunningTime="2025-11-25 09:17:07.613576832 +0000 UTC m=+760.816071970" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.625394 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cb74df96-sj4j7" event={"ID":"cbdce822-eeeb-448b-9f3b-46fdf9e9b43d","Type":"ContainerStarted","Data":"6ec8b754fa21bd05738f07e97e6df2af7a372071cfa3fa4ffe192dfe83fb43e3"} Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.628515 4565 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" podStartSLOduration=6.994588038 podStartE2EDuration="30.628500808s" podCreationTimestamp="2025-11-25 09:16:37 +0000 UTC" firstStartedPulling="2025-11-25 09:16:39.463295939 +0000 UTC m=+732.665791076" lastFinishedPulling="2025-11-25 09:17:03.097208708 +0000 UTC m=+756.299703846" observedRunningTime="2025-11-25 09:17:07.627750092 +0000 UTC m=+760.830245230" watchObservedRunningTime="2025-11-25 09:17:07.628500808 +0000 UTC m=+760.830995946" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.661070 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" podStartSLOduration=4.557156703 podStartE2EDuration="30.661054429s" podCreationTimestamp="2025-11-25 09:16:37 +0000 UTC" firstStartedPulling="2025-11-25 09:16:39.064596907 +0000 UTC m=+732.267092045" lastFinishedPulling="2025-11-25 09:17:05.168494634 +0000 UTC m=+758.370989771" observedRunningTime="2025-11-25 09:17:07.65115602 +0000 UTC m=+760.853651157" watchObservedRunningTime="2025-11-25 09:17:07.661054429 +0000 UTC m=+760.863549568" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.668128 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" event={"ID":"d5be161b-0f0c-485e-b1c7-50a9fff4b053","Type":"ContainerStarted","Data":"981a6813b434858f6aee87cf45c2da0c8964a9fce160bb4d4b9f1ddc8370ba5b"} Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.668162 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.730316 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" podStartSLOduration=19.238836203 podStartE2EDuration="30.730304284s" podCreationTimestamp="2025-11-25 09:16:37 +0000 UTC" firstStartedPulling="2025-11-25 09:16:53.492973907 +0000 UTC m=+746.695469044" lastFinishedPulling="2025-11-25 09:17:04.984441996 +0000 UTC m=+758.186937125" observedRunningTime="2025-11-25 09:17:07.705301427 +0000 UTC m=+760.907796566" watchObservedRunningTime="2025-11-25 09:17:07.730304284 +0000 UTC m=+760.932799422" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.757492 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" podStartSLOduration=3.95646942 podStartE2EDuration="30.757474497s" podCreationTimestamp="2025-11-25 09:16:37 +0000 UTC" firstStartedPulling="2025-11-25 09:16:38.513066598 +0000 UTC m=+731.715561736" lastFinishedPulling="2025-11-25 09:17:05.314071674 +0000 UTC m=+758.516566813" observedRunningTime="2025-11-25 09:17:07.754234418 +0000 UTC m=+760.956729557" watchObservedRunningTime="2025-11-25 09:17:07.757474497 +0000 UTC m=+760.959969635" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.795398 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" podStartSLOduration=5.151972466 podStartE2EDuration="30.795381382s" podCreationTimestamp="2025-11-25 09:16:37 +0000 UTC" firstStartedPulling="2025-11-25 09:16:39.341087383 +0000 UTC m=+732.543582521" lastFinishedPulling="2025-11-25 09:17:04.984496298 +0000 UTC m=+758.186991437" observedRunningTime="2025-11-25 09:17:07.786653918 +0000 UTC m=+760.989149046" watchObservedRunningTime="2025-11-25 09:17:07.795381382 +0000 UTC m=+760.997876520" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.844026 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" podStartSLOduration=5.3041787639999995 podStartE2EDuration="30.844009458s" podCreationTimestamp="2025-11-25 09:16:37 +0000 UTC" firstStartedPulling="2025-11-25 09:16:39.468750231 +0000 UTC m=+732.671245369" lastFinishedPulling="2025-11-25 09:17:05.008580924 +0000 UTC m=+758.211076063" observedRunningTime="2025-11-25 09:17:07.840408469 +0000 UTC m=+761.042903607" watchObservedRunningTime="2025-11-25 09:17:07.844009458 +0000 UTC m=+761.046504597" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.879015 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" podStartSLOduration=7.129133802 podStartE2EDuration="30.879000225s" podCreationTimestamp="2025-11-25 09:16:37 +0000 UTC" firstStartedPulling="2025-11-25 09:16:39.342729788 +0000 UTC m=+732.545224927" lastFinishedPulling="2025-11-25 09:17:03.092596212 +0000 UTC m=+756.295091350" observedRunningTime="2025-11-25 09:17:07.87841947 +0000 UTC m=+761.080914608" watchObservedRunningTime="2025-11-25 09:17:07.879000225 +0000 UTC m=+761.081495363" Nov 25 09:17:07 crc kubenswrapper[4565]: I1125 09:17:07.901861 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" podStartSLOduration=5.268629905 podStartE2EDuration="30.901847868s" podCreationTimestamp="2025-11-25 09:16:37 +0000 UTC" firstStartedPulling="2025-11-25 09:16:39.351137991 +0000 UTC m=+732.553633129" lastFinishedPulling="2025-11-25 09:17:04.984355954 +0000 UTC m=+758.186851092" observedRunningTime="2025-11-25 09:17:07.899543795 +0000 UTC m=+761.102047099" watchObservedRunningTime="2025-11-25 09:17:07.901847868 +0000 UTC m=+761.104343006" Nov 25 09:17:08 crc kubenswrapper[4565]: I1125 09:17:08.675246 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" event={"ID":"1ef630cb-2220-41f5-8a3d-66a2a78ce0ce","Type":"ContainerStarted","Data":"c258ff8ac7399fce58283640d51f84b4bd8f8831fa5a6a1f461ab97cbe543478"} Nov 25 09:17:08 crc kubenswrapper[4565]: I1125 09:17:08.675573 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" Nov 25 09:17:08 crc kubenswrapper[4565]: I1125 09:17:08.677437 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cb74df96-sj4j7" event={"ID":"cbdce822-eeeb-448b-9f3b-46fdf9e9b43d","Type":"ContainerStarted","Data":"4ae787c992f422bad5650db2f5c33027fd219d2cd3dc140feb4b9b01f2905d86"} Nov 25 09:17:08 crc kubenswrapper[4565]: I1125 09:17:08.677558 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cb74df96-sj4j7" Nov 25 09:17:08 crc kubenswrapper[4565]: I1125 09:17:08.679646 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" event={"ID":"d0ef0237-045a-4153-a377-07b2c9e6ceba","Type":"ContainerStarted","Data":"d13e8c74f9cc5d9e944d39736d8837666c1e313997d445f97f52e71d697987b8"} Nov 25 09:17:08 crc kubenswrapper[4565]: I1125 09:17:08.695587 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" podStartSLOduration=3.162790598 podStartE2EDuration="31.69557433s" podCreationTimestamp="2025-11-25 09:16:37 +0000 UTC" firstStartedPulling="2025-11-25 09:16:39.462839718 +0000 UTC m=+732.665334856" lastFinishedPulling="2025-11-25 09:17:07.99562345 +0000 UTC m=+761.198118588" observedRunningTime="2025-11-25 09:17:08.691548761 +0000 UTC m=+761.894043899" watchObservedRunningTime="2025-11-25 09:17:08.69557433 +0000 UTC m=+761.898069468" 
Nov 25 09:17:08 crc kubenswrapper[4565]: I1125 09:17:08.712249 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" podStartSLOduration=5.172845317 podStartE2EDuration="31.71224039s" podCreationTimestamp="2025-11-25 09:16:37 +0000 UTC" firstStartedPulling="2025-11-25 09:16:39.290241415 +0000 UTC m=+732.492736553" lastFinishedPulling="2025-11-25 09:17:05.829636488 +0000 UTC m=+759.032131626" observedRunningTime="2025-11-25 09:17:08.711474476 +0000 UTC m=+761.913969614" watchObservedRunningTime="2025-11-25 09:17:08.71224039 +0000 UTC m=+761.914735528" Nov 25 09:17:08 crc kubenswrapper[4565]: I1125 09:17:08.729876 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cb74df96-sj4j7" podStartSLOduration=2.861156057 podStartE2EDuration="31.729869906s" podCreationTimestamp="2025-11-25 09:16:37 +0000 UTC" firstStartedPulling="2025-11-25 09:16:39.316140432 +0000 UTC m=+732.518635569" lastFinishedPulling="2025-11-25 09:17:08.184854289 +0000 UTC m=+761.387349418" observedRunningTime="2025-11-25 09:17:08.726297141 +0000 UTC m=+761.928792279" watchObservedRunningTime="2025-11-25 09:17:08.729869906 +0000 UTC m=+761.932365044" Nov 25 09:17:09 crc kubenswrapper[4565]: I1125 09:17:09.687744 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" Nov 25 09:17:11 crc kubenswrapper[4565]: I1125 09:17:11.790604 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" Nov 25 09:17:12 crc kubenswrapper[4565]: I1125 09:17:12.106018 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" Nov 25 09:17:17 crc 
kubenswrapper[4565]: I1125 09:17:17.529151 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" Nov 25 09:17:17 crc kubenswrapper[4565]: I1125 09:17:17.876583 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" Nov 25 09:17:17 crc kubenswrapper[4565]: I1125 09:17:17.896608 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" Nov 25 09:17:17 crc kubenswrapper[4565]: I1125 09:17:17.980654 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" Nov 25 09:17:18 crc kubenswrapper[4565]: I1125 09:17:18.213009 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" Nov 25 09:17:18 crc kubenswrapper[4565]: I1125 09:17:18.265600 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" Nov 25 09:17:18 crc kubenswrapper[4565]: I1125 09:17:18.309030 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" Nov 25 09:17:18 crc kubenswrapper[4565]: I1125 09:17:18.341448 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cb74df96-sj4j7" Nov 25 09:17:18 crc kubenswrapper[4565]: I1125 09:17:18.372272 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" Nov 25 09:17:18 crc kubenswrapper[4565]: I1125 09:17:18.395013 4565 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" Nov 25 09:17:20 crc kubenswrapper[4565]: I1125 09:17:20.999687 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x6hrz"] Nov 25 09:17:21 crc kubenswrapper[4565]: I1125 09:17:21.001414 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6hrz" Nov 25 09:17:21 crc kubenswrapper[4565]: I1125 09:17:21.005621 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x6hrz"] Nov 25 09:17:21 crc kubenswrapper[4565]: I1125 09:17:21.018948 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4xzg\" (UniqueName: \"kubernetes.io/projected/7923e6bd-fee0-4285-a8d4-c369536308a2-kube-api-access-m4xzg\") pod \"certified-operators-x6hrz\" (UID: \"7923e6bd-fee0-4285-a8d4-c369536308a2\") " pod="openshift-marketplace/certified-operators-x6hrz" Nov 25 09:17:21 crc kubenswrapper[4565]: I1125 09:17:21.018993 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7923e6bd-fee0-4285-a8d4-c369536308a2-catalog-content\") pod \"certified-operators-x6hrz\" (UID: \"7923e6bd-fee0-4285-a8d4-c369536308a2\") " pod="openshift-marketplace/certified-operators-x6hrz" Nov 25 09:17:21 crc kubenswrapper[4565]: I1125 09:17:21.019039 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7923e6bd-fee0-4285-a8d4-c369536308a2-utilities\") pod \"certified-operators-x6hrz\" (UID: \"7923e6bd-fee0-4285-a8d4-c369536308a2\") " pod="openshift-marketplace/certified-operators-x6hrz" Nov 25 09:17:21 crc kubenswrapper[4565]: I1125 09:17:21.119831 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7923e6bd-fee0-4285-a8d4-c369536308a2-utilities\") pod \"certified-operators-x6hrz\" (UID: \"7923e6bd-fee0-4285-a8d4-c369536308a2\") " pod="openshift-marketplace/certified-operators-x6hrz" Nov 25 09:17:21 crc kubenswrapper[4565]: I1125 09:17:21.120147 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4xzg\" (UniqueName: \"kubernetes.io/projected/7923e6bd-fee0-4285-a8d4-c369536308a2-kube-api-access-m4xzg\") pod \"certified-operators-x6hrz\" (UID: \"7923e6bd-fee0-4285-a8d4-c369536308a2\") " pod="openshift-marketplace/certified-operators-x6hrz" Nov 25 09:17:21 crc kubenswrapper[4565]: I1125 09:17:21.120209 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7923e6bd-fee0-4285-a8d4-c369536308a2-catalog-content\") pod \"certified-operators-x6hrz\" (UID: \"7923e6bd-fee0-4285-a8d4-c369536308a2\") " pod="openshift-marketplace/certified-operators-x6hrz" Nov 25 09:17:21 crc kubenswrapper[4565]: I1125 09:17:21.120313 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7923e6bd-fee0-4285-a8d4-c369536308a2-utilities\") pod \"certified-operators-x6hrz\" (UID: \"7923e6bd-fee0-4285-a8d4-c369536308a2\") " pod="openshift-marketplace/certified-operators-x6hrz" Nov 25 09:17:21 crc kubenswrapper[4565]: I1125 09:17:21.120509 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7923e6bd-fee0-4285-a8d4-c369536308a2-catalog-content\") pod \"certified-operators-x6hrz\" (UID: \"7923e6bd-fee0-4285-a8d4-c369536308a2\") " pod="openshift-marketplace/certified-operators-x6hrz" Nov 25 09:17:21 crc kubenswrapper[4565]: I1125 09:17:21.135710 4565 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-m4xzg\" (UniqueName: \"kubernetes.io/projected/7923e6bd-fee0-4285-a8d4-c369536308a2-kube-api-access-m4xzg\") pod \"certified-operators-x6hrz\" (UID: \"7923e6bd-fee0-4285-a8d4-c369536308a2\") " pod="openshift-marketplace/certified-operators-x6hrz" Nov 25 09:17:21 crc kubenswrapper[4565]: I1125 09:17:21.318020 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6hrz" Nov 25 09:17:21 crc kubenswrapper[4565]: I1125 09:17:21.731250 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x6hrz"] Nov 25 09:17:21 crc kubenswrapper[4565]: W1125 09:17:21.734994 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7923e6bd_fee0_4285_a8d4_c369536308a2.slice/crio-379a807432fbc155a49ccbc4572e1f0acf5e3fbe2d5273bc48193b90c376aae1 WatchSource:0}: Error finding container 379a807432fbc155a49ccbc4572e1f0acf5e3fbe2d5273bc48193b90c376aae1: Status 404 returned error can't find the container with id 379a807432fbc155a49ccbc4572e1f0acf5e3fbe2d5273bc48193b90c376aae1 Nov 25 09:17:21 crc kubenswrapper[4565]: I1125 09:17:21.759035 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6hrz" event={"ID":"7923e6bd-fee0-4285-a8d4-c369536308a2","Type":"ContainerStarted","Data":"379a807432fbc155a49ccbc4572e1f0acf5e3fbe2d5273bc48193b90c376aae1"} Nov 25 09:17:22 crc kubenswrapper[4565]: I1125 09:17:22.766462 4565 generic.go:334] "Generic (PLEG): container finished" podID="7923e6bd-fee0-4285-a8d4-c369536308a2" containerID="00657647c320ec8fa95a327ab78b4362d1469c28d9af4a318d15f11e6a14af54" exitCode=0 Nov 25 09:17:22 crc kubenswrapper[4565]: I1125 09:17:22.766513 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6hrz" 
event={"ID":"7923e6bd-fee0-4285-a8d4-c369536308a2","Type":"ContainerDied","Data":"00657647c320ec8fa95a327ab78b4362d1469c28d9af4a318d15f11e6a14af54"} Nov 25 09:17:23 crc kubenswrapper[4565]: I1125 09:17:23.773709 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6hrz" event={"ID":"7923e6bd-fee0-4285-a8d4-c369536308a2","Type":"ContainerStarted","Data":"c8575c992c075e9cea5f30b479ab8f6f7af35386a0d5e778120b50c5acf2d565"} Nov 25 09:17:24 crc kubenswrapper[4565]: I1125 09:17:24.781008 4565 generic.go:334] "Generic (PLEG): container finished" podID="7923e6bd-fee0-4285-a8d4-c369536308a2" containerID="c8575c992c075e9cea5f30b479ab8f6f7af35386a0d5e778120b50c5acf2d565" exitCode=0 Nov 25 09:17:24 crc kubenswrapper[4565]: I1125 09:17:24.781044 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6hrz" event={"ID":"7923e6bd-fee0-4285-a8d4-c369536308a2","Type":"ContainerDied","Data":"c8575c992c075e9cea5f30b479ab8f6f7af35386a0d5e778120b50c5acf2d565"} Nov 25 09:17:25 crc kubenswrapper[4565]: I1125 09:17:25.789315 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6hrz" event={"ID":"7923e6bd-fee0-4285-a8d4-c369536308a2","Type":"ContainerStarted","Data":"6f899931cbde47503e683b08771fcf7cb463ec2c98b31e8c714f473e39e17492"} Nov 25 09:17:25 crc kubenswrapper[4565]: I1125 09:17:25.807479 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x6hrz" podStartSLOduration=3.254067484 podStartE2EDuration="5.807466288s" podCreationTimestamp="2025-11-25 09:17:20 +0000 UTC" firstStartedPulling="2025-11-25 09:17:22.767615393 +0000 UTC m=+775.970110531" lastFinishedPulling="2025-11-25 09:17:25.321014198 +0000 UTC m=+778.523509335" observedRunningTime="2025-11-25 09:17:25.804000363 +0000 UTC m=+779.006495502" watchObservedRunningTime="2025-11-25 09:17:25.807466288 +0000 UTC 
m=+779.009961425" Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.359459 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bdd77c89-w4clf"] Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.361006 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bdd77c89-w4clf" Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.373528 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.375943 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-kkgwb" Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.376051 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.376111 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.379664 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bdd77c89-w4clf"] Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.440813 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5xr8\" (UniqueName: \"kubernetes.io/projected/2a2141ea-42ca-43de-9345-c01631407a93-kube-api-access-t5xr8\") pod \"dnsmasq-dns-7bdd77c89-w4clf\" (UID: \"2a2141ea-42ca-43de-9345-c01631407a93\") " pod="openstack/dnsmasq-dns-7bdd77c89-w4clf" Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.440869 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a2141ea-42ca-43de-9345-c01631407a93-config\") pod \"dnsmasq-dns-7bdd77c89-w4clf\" (UID: \"2a2141ea-42ca-43de-9345-c01631407a93\") " pod="openstack/dnsmasq-dns-7bdd77c89-w4clf" Nov 25 
09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.456422 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6584b49599-jmrd7"] Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.457530 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6584b49599-jmrd7" Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.460093 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.471379 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6584b49599-jmrd7"] Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.543169 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53d3ef3f-d96c-46ad-b34a-bb9503143f25-dns-svc\") pod \"dnsmasq-dns-6584b49599-jmrd7\" (UID: \"53d3ef3f-d96c-46ad-b34a-bb9503143f25\") " pod="openstack/dnsmasq-dns-6584b49599-jmrd7" Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.543270 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5xr8\" (UniqueName: \"kubernetes.io/projected/2a2141ea-42ca-43de-9345-c01631407a93-kube-api-access-t5xr8\") pod \"dnsmasq-dns-7bdd77c89-w4clf\" (UID: \"2a2141ea-42ca-43de-9345-c01631407a93\") " pod="openstack/dnsmasq-dns-7bdd77c89-w4clf" Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.543337 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a2141ea-42ca-43de-9345-c01631407a93-config\") pod \"dnsmasq-dns-7bdd77c89-w4clf\" (UID: \"2a2141ea-42ca-43de-9345-c01631407a93\") " pod="openstack/dnsmasq-dns-7bdd77c89-w4clf" Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.543369 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cnltl\" (UniqueName: \"kubernetes.io/projected/53d3ef3f-d96c-46ad-b34a-bb9503143f25-kube-api-access-cnltl\") pod \"dnsmasq-dns-6584b49599-jmrd7\" (UID: \"53d3ef3f-d96c-46ad-b34a-bb9503143f25\") " pod="openstack/dnsmasq-dns-6584b49599-jmrd7" Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.543413 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d3ef3f-d96c-46ad-b34a-bb9503143f25-config\") pod \"dnsmasq-dns-6584b49599-jmrd7\" (UID: \"53d3ef3f-d96c-46ad-b34a-bb9503143f25\") " pod="openstack/dnsmasq-dns-6584b49599-jmrd7" Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.544183 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a2141ea-42ca-43de-9345-c01631407a93-config\") pod \"dnsmasq-dns-7bdd77c89-w4clf\" (UID: \"2a2141ea-42ca-43de-9345-c01631407a93\") " pod="openstack/dnsmasq-dns-7bdd77c89-w4clf" Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.561076 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5xr8\" (UniqueName: \"kubernetes.io/projected/2a2141ea-42ca-43de-9345-c01631407a93-kube-api-access-t5xr8\") pod \"dnsmasq-dns-7bdd77c89-w4clf\" (UID: \"2a2141ea-42ca-43de-9345-c01631407a93\") " pod="openstack/dnsmasq-dns-7bdd77c89-w4clf" Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.644306 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d3ef3f-d96c-46ad-b34a-bb9503143f25-config\") pod \"dnsmasq-dns-6584b49599-jmrd7\" (UID: \"53d3ef3f-d96c-46ad-b34a-bb9503143f25\") " pod="openstack/dnsmasq-dns-6584b49599-jmrd7" Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.644440 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/53d3ef3f-d96c-46ad-b34a-bb9503143f25-dns-svc\") pod \"dnsmasq-dns-6584b49599-jmrd7\" (UID: \"53d3ef3f-d96c-46ad-b34a-bb9503143f25\") " pod="openstack/dnsmasq-dns-6584b49599-jmrd7" Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.644512 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnltl\" (UniqueName: \"kubernetes.io/projected/53d3ef3f-d96c-46ad-b34a-bb9503143f25-kube-api-access-cnltl\") pod \"dnsmasq-dns-6584b49599-jmrd7\" (UID: \"53d3ef3f-d96c-46ad-b34a-bb9503143f25\") " pod="openstack/dnsmasq-dns-6584b49599-jmrd7" Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.645570 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d3ef3f-d96c-46ad-b34a-bb9503143f25-config\") pod \"dnsmasq-dns-6584b49599-jmrd7\" (UID: \"53d3ef3f-d96c-46ad-b34a-bb9503143f25\") " pod="openstack/dnsmasq-dns-6584b49599-jmrd7" Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.645576 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53d3ef3f-d96c-46ad-b34a-bb9503143f25-dns-svc\") pod \"dnsmasq-dns-6584b49599-jmrd7\" (UID: \"53d3ef3f-d96c-46ad-b34a-bb9503143f25\") " pod="openstack/dnsmasq-dns-6584b49599-jmrd7" Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.659254 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnltl\" (UniqueName: \"kubernetes.io/projected/53d3ef3f-d96c-46ad-b34a-bb9503143f25-kube-api-access-cnltl\") pod \"dnsmasq-dns-6584b49599-jmrd7\" (UID: \"53d3ef3f-d96c-46ad-b34a-bb9503143f25\") " pod="openstack/dnsmasq-dns-6584b49599-jmrd7" Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.676039 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bdd77c89-w4clf" Nov 25 09:17:30 crc kubenswrapper[4565]: I1125 09:17:30.770305 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6584b49599-jmrd7" Nov 25 09:17:31 crc kubenswrapper[4565]: I1125 09:17:31.116984 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bdd77c89-w4clf"] Nov 25 09:17:31 crc kubenswrapper[4565]: W1125 09:17:31.122542 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a2141ea_42ca_43de_9345_c01631407a93.slice/crio-ee2911e0c9958bf3a4e6216c65dc2705733b287a0bde4ad4a3dc3b11ee4eb35b WatchSource:0}: Error finding container ee2911e0c9958bf3a4e6216c65dc2705733b287a0bde4ad4a3dc3b11ee4eb35b: Status 404 returned error can't find the container with id ee2911e0c9958bf3a4e6216c65dc2705733b287a0bde4ad4a3dc3b11ee4eb35b Nov 25 09:17:31 crc kubenswrapper[4565]: I1125 09:17:31.218253 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6584b49599-jmrd7"] Nov 25 09:17:31 crc kubenswrapper[4565]: I1125 09:17:31.318765 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x6hrz" Nov 25 09:17:31 crc kubenswrapper[4565]: I1125 09:17:31.318850 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x6hrz" Nov 25 09:17:31 crc kubenswrapper[4565]: I1125 09:17:31.351099 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x6hrz" Nov 25 09:17:31 crc kubenswrapper[4565]: I1125 09:17:31.842617 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdd77c89-w4clf" 
event={"ID":"2a2141ea-42ca-43de-9345-c01631407a93","Type":"ContainerStarted","Data":"ee2911e0c9958bf3a4e6216c65dc2705733b287a0bde4ad4a3dc3b11ee4eb35b"} Nov 25 09:17:31 crc kubenswrapper[4565]: I1125 09:17:31.844196 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6584b49599-jmrd7" event={"ID":"53d3ef3f-d96c-46ad-b34a-bb9503143f25","Type":"ContainerStarted","Data":"f5d86d8486c6b0ae350edebcfe896e94644bc154a236e01c9683f6aa31288307"} Nov 25 09:17:31 crc kubenswrapper[4565]: I1125 09:17:31.880821 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x6hrz" Nov 25 09:17:31 crc kubenswrapper[4565]: I1125 09:17:31.921870 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x6hrz"] Nov 25 09:17:33 crc kubenswrapper[4565]: I1125 09:17:33.489032 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6584b49599-jmrd7"] Nov 25 09:17:33 crc kubenswrapper[4565]: I1125 09:17:33.523769 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d8746976c-gdbcm"] Nov 25 09:17:33 crc kubenswrapper[4565]: I1125 09:17:33.525459 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d8746976c-gdbcm" Nov 25 09:17:33 crc kubenswrapper[4565]: I1125 09:17:33.540154 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d8746976c-gdbcm"] Nov 25 09:17:33 crc kubenswrapper[4565]: I1125 09:17:33.695095 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngmnv\" (UniqueName: \"kubernetes.io/projected/56879c95-6643-4472-993d-41fc2b340dc1-kube-api-access-ngmnv\") pod \"dnsmasq-dns-6d8746976c-gdbcm\" (UID: \"56879c95-6643-4472-993d-41fc2b340dc1\") " pod="openstack/dnsmasq-dns-6d8746976c-gdbcm" Nov 25 09:17:33 crc kubenswrapper[4565]: I1125 09:17:33.695324 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56879c95-6643-4472-993d-41fc2b340dc1-dns-svc\") pod \"dnsmasq-dns-6d8746976c-gdbcm\" (UID: \"56879c95-6643-4472-993d-41fc2b340dc1\") " pod="openstack/dnsmasq-dns-6d8746976c-gdbcm" Nov 25 09:17:33 crc kubenswrapper[4565]: I1125 09:17:33.695429 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56879c95-6643-4472-993d-41fc2b340dc1-config\") pod \"dnsmasq-dns-6d8746976c-gdbcm\" (UID: \"56879c95-6643-4472-993d-41fc2b340dc1\") " pod="openstack/dnsmasq-dns-6d8746976c-gdbcm" Nov 25 09:17:33 crc kubenswrapper[4565]: I1125 09:17:33.797136 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56879c95-6643-4472-993d-41fc2b340dc1-dns-svc\") pod \"dnsmasq-dns-6d8746976c-gdbcm\" (UID: \"56879c95-6643-4472-993d-41fc2b340dc1\") " pod="openstack/dnsmasq-dns-6d8746976c-gdbcm" Nov 25 09:17:33 crc kubenswrapper[4565]: I1125 09:17:33.797181 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/56879c95-6643-4472-993d-41fc2b340dc1-config\") pod \"dnsmasq-dns-6d8746976c-gdbcm\" (UID: \"56879c95-6643-4472-993d-41fc2b340dc1\") " pod="openstack/dnsmasq-dns-6d8746976c-gdbcm" Nov 25 09:17:33 crc kubenswrapper[4565]: I1125 09:17:33.797253 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngmnv\" (UniqueName: \"kubernetes.io/projected/56879c95-6643-4472-993d-41fc2b340dc1-kube-api-access-ngmnv\") pod \"dnsmasq-dns-6d8746976c-gdbcm\" (UID: \"56879c95-6643-4472-993d-41fc2b340dc1\") " pod="openstack/dnsmasq-dns-6d8746976c-gdbcm" Nov 25 09:17:33 crc kubenswrapper[4565]: I1125 09:17:33.798538 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56879c95-6643-4472-993d-41fc2b340dc1-config\") pod \"dnsmasq-dns-6d8746976c-gdbcm\" (UID: \"56879c95-6643-4472-993d-41fc2b340dc1\") " pod="openstack/dnsmasq-dns-6d8746976c-gdbcm" Nov 25 09:17:33 crc kubenswrapper[4565]: I1125 09:17:33.798609 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56879c95-6643-4472-993d-41fc2b340dc1-dns-svc\") pod \"dnsmasq-dns-6d8746976c-gdbcm\" (UID: \"56879c95-6643-4472-993d-41fc2b340dc1\") " pod="openstack/dnsmasq-dns-6d8746976c-gdbcm" Nov 25 09:17:33 crc kubenswrapper[4565]: I1125 09:17:33.832652 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bdd77c89-w4clf"] Nov 25 09:17:33 crc kubenswrapper[4565]: I1125 09:17:33.836011 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngmnv\" (UniqueName: \"kubernetes.io/projected/56879c95-6643-4472-993d-41fc2b340dc1-kube-api-access-ngmnv\") pod \"dnsmasq-dns-6d8746976c-gdbcm\" (UID: \"56879c95-6643-4472-993d-41fc2b340dc1\") " pod="openstack/dnsmasq-dns-6d8746976c-gdbcm" Nov 25 09:17:33 crc kubenswrapper[4565]: I1125 09:17:33.848792 4565 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d8746976c-gdbcm" Nov 25 09:17:33 crc kubenswrapper[4565]: I1125 09:17:33.875871 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x6hrz" podUID="7923e6bd-fee0-4285-a8d4-c369536308a2" containerName="registry-server" containerID="cri-o://6f899931cbde47503e683b08771fcf7cb463ec2c98b31e8c714f473e39e17492" gracePeriod=2 Nov 25 09:17:33 crc kubenswrapper[4565]: I1125 09:17:33.877996 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6486446b9f-4p45g"] Nov 25 09:17:33 crc kubenswrapper[4565]: I1125 09:17:33.879184 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6486446b9f-4p45g" Nov 25 09:17:33 crc kubenswrapper[4565]: I1125 09:17:33.910599 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6486446b9f-4p45g"] Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.007861 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gmpg\" (UniqueName: \"kubernetes.io/projected/f98fcb7a-0761-41fb-b312-3d7188057efc-kube-api-access-4gmpg\") pod \"dnsmasq-dns-6486446b9f-4p45g\" (UID: \"f98fcb7a-0761-41fb-b312-3d7188057efc\") " pod="openstack/dnsmasq-dns-6486446b9f-4p45g" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.009704 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f98fcb7a-0761-41fb-b312-3d7188057efc-dns-svc\") pod \"dnsmasq-dns-6486446b9f-4p45g\" (UID: \"f98fcb7a-0761-41fb-b312-3d7188057efc\") " pod="openstack/dnsmasq-dns-6486446b9f-4p45g" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.009814 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f98fcb7a-0761-41fb-b312-3d7188057efc-config\") pod \"dnsmasq-dns-6486446b9f-4p45g\" (UID: \"f98fcb7a-0761-41fb-b312-3d7188057efc\") " pod="openstack/dnsmasq-dns-6486446b9f-4p45g" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.111443 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f98fcb7a-0761-41fb-b312-3d7188057efc-dns-svc\") pod \"dnsmasq-dns-6486446b9f-4p45g\" (UID: \"f98fcb7a-0761-41fb-b312-3d7188057efc\") " pod="openstack/dnsmasq-dns-6486446b9f-4p45g" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.111496 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f98fcb7a-0761-41fb-b312-3d7188057efc-config\") pod \"dnsmasq-dns-6486446b9f-4p45g\" (UID: \"f98fcb7a-0761-41fb-b312-3d7188057efc\") " pod="openstack/dnsmasq-dns-6486446b9f-4p45g" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.111538 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gmpg\" (UniqueName: \"kubernetes.io/projected/f98fcb7a-0761-41fb-b312-3d7188057efc-kube-api-access-4gmpg\") pod \"dnsmasq-dns-6486446b9f-4p45g\" (UID: \"f98fcb7a-0761-41fb-b312-3d7188057efc\") " pod="openstack/dnsmasq-dns-6486446b9f-4p45g" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.112583 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f98fcb7a-0761-41fb-b312-3d7188057efc-dns-svc\") pod \"dnsmasq-dns-6486446b9f-4p45g\" (UID: \"f98fcb7a-0761-41fb-b312-3d7188057efc\") " pod="openstack/dnsmasq-dns-6486446b9f-4p45g" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.113147 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f98fcb7a-0761-41fb-b312-3d7188057efc-config\") pod \"dnsmasq-dns-6486446b9f-4p45g\" (UID: 
\"f98fcb7a-0761-41fb-b312-3d7188057efc\") " pod="openstack/dnsmasq-dns-6486446b9f-4p45g" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.146808 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gmpg\" (UniqueName: \"kubernetes.io/projected/f98fcb7a-0761-41fb-b312-3d7188057efc-kube-api-access-4gmpg\") pod \"dnsmasq-dns-6486446b9f-4p45g\" (UID: \"f98fcb7a-0761-41fb-b312-3d7188057efc\") " pod="openstack/dnsmasq-dns-6486446b9f-4p45g" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.198000 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6486446b9f-4p45g" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.507701 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d8746976c-gdbcm"] Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.684625 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.685820 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.688753 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.688757 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.689151 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.689275 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.689416 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xg26s" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.690429 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.695178 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.695330 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.702218 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x6hrz" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.787958 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6486446b9f-4p45g"] Nov 25 09:17:34 crc kubenswrapper[4565]: W1125 09:17:34.806563 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf98fcb7a_0761_41fb_b312_3d7188057efc.slice/crio-d3d19c31de05c7d559bbfa2fe32d91f9ed29a9c5b9c7de0feb1fdf780b19145d WatchSource:0}: Error finding container d3d19c31de05c7d559bbfa2fe32d91f9ed29a9c5b9c7de0feb1fdf780b19145d: Status 404 returned error can't find the container with id d3d19c31de05c7d559bbfa2fe32d91f9ed29a9c5b9c7de0feb1fdf780b19145d Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.846922 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7923e6bd-fee0-4285-a8d4-c369536308a2-utilities\") pod \"7923e6bd-fee0-4285-a8d4-c369536308a2\" (UID: \"7923e6bd-fee0-4285-a8d4-c369536308a2\") " Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.846985 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4xzg\" (UniqueName: \"kubernetes.io/projected/7923e6bd-fee0-4285-a8d4-c369536308a2-kube-api-access-m4xzg\") pod \"7923e6bd-fee0-4285-a8d4-c369536308a2\" (UID: \"7923e6bd-fee0-4285-a8d4-c369536308a2\") " Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.847050 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7923e6bd-fee0-4285-a8d4-c369536308a2-catalog-content\") pod \"7923e6bd-fee0-4285-a8d4-c369536308a2\" (UID: \"7923e6bd-fee0-4285-a8d4-c369536308a2\") " Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.847215 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b0cc10ca-7483-447d-a1ed-1566c994efdc-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.847241 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b0cc10ca-7483-447d-a1ed-1566c994efdc-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.847263 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b0cc10ca-7483-447d-a1ed-1566c994efdc-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.847286 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b0cc10ca-7483-447d-a1ed-1566c994efdc-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.847682 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.847742 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b0cc10ca-7483-447d-a1ed-1566c994efdc-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.847766 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b0cc10ca-7483-447d-a1ed-1566c994efdc-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.847785 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpmk8\" (UniqueName: \"kubernetes.io/projected/b0cc10ca-7483-447d-a1ed-1566c994efdc-kube-api-access-lpmk8\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.847814 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b0cc10ca-7483-447d-a1ed-1566c994efdc-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.847848 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b0cc10ca-7483-447d-a1ed-1566c994efdc-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.847989 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7923e6bd-fee0-4285-a8d4-c369536308a2-utilities" (OuterVolumeSpecName: "utilities") pod "7923e6bd-fee0-4285-a8d4-c369536308a2" (UID: "7923e6bd-fee0-4285-a8d4-c369536308a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.848226 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b0cc10ca-7483-447d-a1ed-1566c994efdc-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.848312 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7923e6bd-fee0-4285-a8d4-c369536308a2-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.855066 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7923e6bd-fee0-4285-a8d4-c369536308a2-kube-api-access-m4xzg" (OuterVolumeSpecName: "kube-api-access-m4xzg") pod "7923e6bd-fee0-4285-a8d4-c369536308a2" (UID: "7923e6bd-fee0-4285-a8d4-c369536308a2"). InnerVolumeSpecName "kube-api-access-m4xzg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.889723 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d8746976c-gdbcm" event={"ID":"56879c95-6643-4472-993d-41fc2b340dc1","Type":"ContainerStarted","Data":"7bb8a71475725bffcb78f200a8502a663b07b4c5b3113cbf47876d3b31c6a38b"} Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.894452 4565 generic.go:334] "Generic (PLEG): container finished" podID="7923e6bd-fee0-4285-a8d4-c369536308a2" containerID="6f899931cbde47503e683b08771fcf7cb463ec2c98b31e8c714f473e39e17492" exitCode=0 Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.894547 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6hrz" event={"ID":"7923e6bd-fee0-4285-a8d4-c369536308a2","Type":"ContainerDied","Data":"6f899931cbde47503e683b08771fcf7cb463ec2c98b31e8c714f473e39e17492"} Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.894631 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6hrz" event={"ID":"7923e6bd-fee0-4285-a8d4-c369536308a2","Type":"ContainerDied","Data":"379a807432fbc155a49ccbc4572e1f0acf5e3fbe2d5273bc48193b90c376aae1"} Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.894559 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x6hrz" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.894696 4565 scope.go:117] "RemoveContainer" containerID="6f899931cbde47503e683b08771fcf7cb463ec2c98b31e8c714f473e39e17492" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.899299 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6486446b9f-4p45g" event={"ID":"f98fcb7a-0761-41fb-b312-3d7188057efc","Type":"ContainerStarted","Data":"d3d19c31de05c7d559bbfa2fe32d91f9ed29a9c5b9c7de0feb1fdf780b19145d"} Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.905629 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7923e6bd-fee0-4285-a8d4-c369536308a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7923e6bd-fee0-4285-a8d4-c369536308a2" (UID: "7923e6bd-fee0-4285-a8d4-c369536308a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.950186 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b0cc10ca-7483-447d-a1ed-1566c994efdc-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.950264 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b0cc10ca-7483-447d-a1ed-1566c994efdc-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.950287 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/b0cc10ca-7483-447d-a1ed-1566c994efdc-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.950310 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b0cc10ca-7483-447d-a1ed-1566c994efdc-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.950333 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b0cc10ca-7483-447d-a1ed-1566c994efdc-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.950352 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.950379 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b0cc10ca-7483-447d-a1ed-1566c994efdc-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.950398 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b0cc10ca-7483-447d-a1ed-1566c994efdc-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.950417 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpmk8\" (UniqueName: \"kubernetes.io/projected/b0cc10ca-7483-447d-a1ed-1566c994efdc-kube-api-access-lpmk8\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.950442 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b0cc10ca-7483-447d-a1ed-1566c994efdc-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.950465 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b0cc10ca-7483-447d-a1ed-1566c994efdc-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.950506 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4xzg\" (UniqueName: \"kubernetes.io/projected/7923e6bd-fee0-4285-a8d4-c369536308a2-kube-api-access-m4xzg\") on node \"crc\" DevicePath \"\"" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.950519 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7923e6bd-fee0-4285-a8d4-c369536308a2-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.950667 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/b0cc10ca-7483-447d-a1ed-1566c994efdc-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.950996 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b0cc10ca-7483-447d-a1ed-1566c994efdc-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.951314 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b0cc10ca-7483-447d-a1ed-1566c994efdc-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.951425 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b0cc10ca-7483-447d-a1ed-1566c994efdc-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.951623 4565 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.951850 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b0cc10ca-7483-447d-a1ed-1566c994efdc-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.957504 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b0cc10ca-7483-447d-a1ed-1566c994efdc-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.957941 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b0cc10ca-7483-447d-a1ed-1566c994efdc-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.960233 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b0cc10ca-7483-447d-a1ed-1566c994efdc-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.961571 4565 scope.go:117] "RemoveContainer" containerID="c8575c992c075e9cea5f30b479ab8f6f7af35386a0d5e778120b50c5acf2d565" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.965684 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b0cc10ca-7483-447d-a1ed-1566c994efdc-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.967948 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpmk8\" (UniqueName: 
\"kubernetes.io/projected/b0cc10ca-7483-447d-a1ed-1566c994efdc-kube-api-access-lpmk8\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:34 crc kubenswrapper[4565]: I1125 09:17:34.975656 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.023443 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.032409 4565 scope.go:117] "RemoveContainer" containerID="00657647c320ec8fa95a327ab78b4362d1469c28d9af4a318d15f11e6a14af54" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.035783 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 09:17:35 crc kubenswrapper[4565]: E1125 09:17:35.036155 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7923e6bd-fee0-4285-a8d4-c369536308a2" containerName="extract-utilities" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.036176 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7923e6bd-fee0-4285-a8d4-c369536308a2" containerName="extract-utilities" Nov 25 09:17:35 crc kubenswrapper[4565]: E1125 09:17:35.036221 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7923e6bd-fee0-4285-a8d4-c369536308a2" containerName="extract-content" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.036228 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7923e6bd-fee0-4285-a8d4-c369536308a2" containerName="extract-content" Nov 25 09:17:35 crc kubenswrapper[4565]: E1125 09:17:35.036244 4565 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7923e6bd-fee0-4285-a8d4-c369536308a2" containerName="registry-server" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.036251 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7923e6bd-fee0-4285-a8d4-c369536308a2" containerName="registry-server" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.036436 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="7923e6bd-fee0-4285-a8d4-c369536308a2" containerName="registry-server" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.037424 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.043893 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.044125 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.044388 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.044549 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.044752 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ss96w" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.044892 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.053713 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.070381 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 09:17:35 crc 
kubenswrapper[4565]: I1125 09:17:35.113636 4565 scope.go:117] "RemoveContainer" containerID="6f899931cbde47503e683b08771fcf7cb463ec2c98b31e8c714f473e39e17492" Nov 25 09:17:35 crc kubenswrapper[4565]: E1125 09:17:35.114339 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f899931cbde47503e683b08771fcf7cb463ec2c98b31e8c714f473e39e17492\": container with ID starting with 6f899931cbde47503e683b08771fcf7cb463ec2c98b31e8c714f473e39e17492 not found: ID does not exist" containerID="6f899931cbde47503e683b08771fcf7cb463ec2c98b31e8c714f473e39e17492" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.114386 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f899931cbde47503e683b08771fcf7cb463ec2c98b31e8c714f473e39e17492"} err="failed to get container status \"6f899931cbde47503e683b08771fcf7cb463ec2c98b31e8c714f473e39e17492\": rpc error: code = NotFound desc = could not find container \"6f899931cbde47503e683b08771fcf7cb463ec2c98b31e8c714f473e39e17492\": container with ID starting with 6f899931cbde47503e683b08771fcf7cb463ec2c98b31e8c714f473e39e17492 not found: ID does not exist" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.114424 4565 scope.go:117] "RemoveContainer" containerID="c8575c992c075e9cea5f30b479ab8f6f7af35386a0d5e778120b50c5acf2d565" Nov 25 09:17:35 crc kubenswrapper[4565]: E1125 09:17:35.115885 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8575c992c075e9cea5f30b479ab8f6f7af35386a0d5e778120b50c5acf2d565\": container with ID starting with c8575c992c075e9cea5f30b479ab8f6f7af35386a0d5e778120b50c5acf2d565 not found: ID does not exist" containerID="c8575c992c075e9cea5f30b479ab8f6f7af35386a0d5e778120b50c5acf2d565" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.117425 4565 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c8575c992c075e9cea5f30b479ab8f6f7af35386a0d5e778120b50c5acf2d565"} err="failed to get container status \"c8575c992c075e9cea5f30b479ab8f6f7af35386a0d5e778120b50c5acf2d565\": rpc error: code = NotFound desc = could not find container \"c8575c992c075e9cea5f30b479ab8f6f7af35386a0d5e778120b50c5acf2d565\": container with ID starting with c8575c992c075e9cea5f30b479ab8f6f7af35386a0d5e778120b50c5acf2d565 not found: ID does not exist" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.117477 4565 scope.go:117] "RemoveContainer" containerID="00657647c320ec8fa95a327ab78b4362d1469c28d9af4a318d15f11e6a14af54" Nov 25 09:17:35 crc kubenswrapper[4565]: E1125 09:17:35.118184 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00657647c320ec8fa95a327ab78b4362d1469c28d9af4a318d15f11e6a14af54\": container with ID starting with 00657647c320ec8fa95a327ab78b4362d1469c28d9af4a318d15f11e6a14af54 not found: ID does not exist" containerID="00657647c320ec8fa95a327ab78b4362d1469c28d9af4a318d15f11e6a14af54" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.118239 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00657647c320ec8fa95a327ab78b4362d1469c28d9af4a318d15f11e6a14af54"} err="failed to get container status \"00657647c320ec8fa95a327ab78b4362d1469c28d9af4a318d15f11e6a14af54\": rpc error: code = NotFound desc = could not find container \"00657647c320ec8fa95a327ab78b4362d1469c28d9af4a318d15f11e6a14af54\": container with ID starting with 00657647c320ec8fa95a327ab78b4362d1469c28d9af4a318d15f11e6a14af54 not found: ID does not exist" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.155291 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/46428d34-ed8b-4584-954a-0c51d96b1c9c-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.155353 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46428d34-ed8b-4584-954a-0c51d96b1c9c-config-data\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.155459 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/46428d34-ed8b-4584-954a-0c51d96b1c9c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.155493 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/46428d34-ed8b-4584-954a-0c51d96b1c9c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.155515 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/46428d34-ed8b-4584-954a-0c51d96b1c9c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.155567 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/46428d34-ed8b-4584-954a-0c51d96b1c9c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " 
pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.155643 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsp4z\" (UniqueName: \"kubernetes.io/projected/46428d34-ed8b-4584-954a-0c51d96b1c9c-kube-api-access-wsp4z\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.155676 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/46428d34-ed8b-4584-954a-0c51d96b1c9c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.155792 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.155813 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/46428d34-ed8b-4584-954a-0c51d96b1c9c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.155873 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/46428d34-ed8b-4584-954a-0c51d96b1c9c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 
09:17:35.242060 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x6hrz"] Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.253867 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x6hrz"] Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.257227 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.257272 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/46428d34-ed8b-4584-954a-0c51d96b1c9c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.257333 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/46428d34-ed8b-4584-954a-0c51d96b1c9c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.257364 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/46428d34-ed8b-4584-954a-0c51d96b1c9c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.257410 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/46428d34-ed8b-4584-954a-0c51d96b1c9c-config-data\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.257487 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/46428d34-ed8b-4584-954a-0c51d96b1c9c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.257512 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/46428d34-ed8b-4584-954a-0c51d96b1c9c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.257532 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/46428d34-ed8b-4584-954a-0c51d96b1c9c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.257559 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/46428d34-ed8b-4584-954a-0c51d96b1c9c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.257581 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsp4z\" (UniqueName: \"kubernetes.io/projected/46428d34-ed8b-4584-954a-0c51d96b1c9c-kube-api-access-wsp4z\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " 
pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.257620 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/46428d34-ed8b-4584-954a-0c51d96b1c9c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.266643 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/46428d34-ed8b-4584-954a-0c51d96b1c9c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.267017 4565 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.270528 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/46428d34-ed8b-4584-954a-0c51d96b1c9c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.271601 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/46428d34-ed8b-4584-954a-0c51d96b1c9c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.272473 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46428d34-ed8b-4584-954a-0c51d96b1c9c-config-data\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.272569 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/46428d34-ed8b-4584-954a-0c51d96b1c9c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.272819 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/46428d34-ed8b-4584-954a-0c51d96b1c9c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.295435 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/46428d34-ed8b-4584-954a-0c51d96b1c9c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.296088 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/46428d34-ed8b-4584-954a-0c51d96b1c9c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.310853 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/46428d34-ed8b-4584-954a-0c51d96b1c9c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " 
pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.316808 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsp4z\" (UniqueName: \"kubernetes.io/projected/46428d34-ed8b-4584-954a-0c51d96b1c9c-kube-api-access-wsp4z\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.330082 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.369240 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.609376 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 09:17:35 crc kubenswrapper[4565]: W1125 09:17:35.641194 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0cc10ca_7483_447d_a1ed_1566c994efdc.slice/crio-d7072647dc0086c0bc879c7e99d9f23cbe7986e40404f10cc02246d1d0026ddd WatchSource:0}: Error finding container d7072647dc0086c0bc879c7e99d9f23cbe7986e40404f10cc02246d1d0026ddd: Status 404 returned error can't find the container with id d7072647dc0086c0bc879c7e99d9f23cbe7986e40404f10cc02246d1d0026ddd Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.925404 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b0cc10ca-7483-447d-a1ed-1566c994efdc","Type":"ContainerStarted","Data":"d7072647dc0086c0bc879c7e99d9f23cbe7986e40404f10cc02246d1d0026ddd"} Nov 25 09:17:35 crc kubenswrapper[4565]: I1125 09:17:35.980416 4565 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 09:17:36 crc kubenswrapper[4565]: I1125 09:17:36.852488 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Nov 25 09:17:36 crc kubenswrapper[4565]: I1125 09:17:36.854653 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 25 09:17:36 crc kubenswrapper[4565]: I1125 09:17:36.864738 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 25 09:17:36 crc kubenswrapper[4565]: I1125 09:17:36.864973 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 25 09:17:36 crc kubenswrapper[4565]: I1125 09:17:36.865225 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 25 09:17:36 crc kubenswrapper[4565]: I1125 09:17:36.865710 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-5wxvg" Nov 25 09:17:36 crc kubenswrapper[4565]: I1125 09:17:36.876978 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 25 09:17:36 crc kubenswrapper[4565]: I1125 09:17:36.877576 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 25 09:17:36 crc kubenswrapper[4565]: I1125 09:17:36.954971 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"46428d34-ed8b-4584-954a-0c51d96b1c9c","Type":"ContainerStarted","Data":"58abd32e4b679253016aba0bf57cb8e33dfff347156cf169ff79b1d690c6c4a3"} Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.011749 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/80ebb406-0240-4ba6-86f1-177776f19865-galera-tls-certs\") pod 
\"openstack-galera-0\" (UID: \"80ebb406-0240-4ba6-86f1-177776f19865\") " pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.011815 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/80ebb406-0240-4ba6-86f1-177776f19865-config-data-default\") pod \"openstack-galera-0\" (UID: \"80ebb406-0240-4ba6-86f1-177776f19865\") " pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.011875 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/80ebb406-0240-4ba6-86f1-177776f19865-config-data-generated\") pod \"openstack-galera-0\" (UID: \"80ebb406-0240-4ba6-86f1-177776f19865\") " pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.011896 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ebb406-0240-4ba6-86f1-177776f19865-operator-scripts\") pod \"openstack-galera-0\" (UID: \"80ebb406-0240-4ba6-86f1-177776f19865\") " pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.012544 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"80ebb406-0240-4ba6-86f1-177776f19865\") " pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.012617 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80ebb406-0240-4ba6-86f1-177776f19865-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"80ebb406-0240-4ba6-86f1-177776f19865\") " pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.012691 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vhw5\" (UniqueName: \"kubernetes.io/projected/80ebb406-0240-4ba6-86f1-177776f19865-kube-api-access-8vhw5\") pod \"openstack-galera-0\" (UID: \"80ebb406-0240-4ba6-86f1-177776f19865\") " pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.012779 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/80ebb406-0240-4ba6-86f1-177776f19865-kolla-config\") pod \"openstack-galera-0\" (UID: \"80ebb406-0240-4ba6-86f1-177776f19865\") " pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.112766 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7923e6bd-fee0-4285-a8d4-c369536308a2" path="/var/lib/kubelet/pods/7923e6bd-fee0-4285-a8d4-c369536308a2/volumes" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.116259 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/80ebb406-0240-4ba6-86f1-177776f19865-config-data-default\") pod \"openstack-galera-0\" (UID: \"80ebb406-0240-4ba6-86f1-177776f19865\") " pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.116289 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/80ebb406-0240-4ba6-86f1-177776f19865-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"80ebb406-0240-4ba6-86f1-177776f19865\") " pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.116328 4565 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/80ebb406-0240-4ba6-86f1-177776f19865-config-data-generated\") pod \"openstack-galera-0\" (UID: \"80ebb406-0240-4ba6-86f1-177776f19865\") " pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.116348 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ebb406-0240-4ba6-86f1-177776f19865-operator-scripts\") pod \"openstack-galera-0\" (UID: \"80ebb406-0240-4ba6-86f1-177776f19865\") " pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.116376 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"80ebb406-0240-4ba6-86f1-177776f19865\") " pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.116399 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80ebb406-0240-4ba6-86f1-177776f19865-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"80ebb406-0240-4ba6-86f1-177776f19865\") " pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.116424 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vhw5\" (UniqueName: \"kubernetes.io/projected/80ebb406-0240-4ba6-86f1-177776f19865-kube-api-access-8vhw5\") pod \"openstack-galera-0\" (UID: \"80ebb406-0240-4ba6-86f1-177776f19865\") " pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.116451 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/80ebb406-0240-4ba6-86f1-177776f19865-kolla-config\") pod 
\"openstack-galera-0\" (UID: \"80ebb406-0240-4ba6-86f1-177776f19865\") " pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.117171 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/80ebb406-0240-4ba6-86f1-177776f19865-kolla-config\") pod \"openstack-galera-0\" (UID: \"80ebb406-0240-4ba6-86f1-177776f19865\") " pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.117919 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/80ebb406-0240-4ba6-86f1-177776f19865-config-data-default\") pod \"openstack-galera-0\" (UID: \"80ebb406-0240-4ba6-86f1-177776f19865\") " pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.118439 4565 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"80ebb406-0240-4ba6-86f1-177776f19865\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.118835 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/80ebb406-0240-4ba6-86f1-177776f19865-config-data-generated\") pod \"openstack-galera-0\" (UID: \"80ebb406-0240-4ba6-86f1-177776f19865\") " pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.120812 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ebb406-0240-4ba6-86f1-177776f19865-operator-scripts\") pod \"openstack-galera-0\" (UID: \"80ebb406-0240-4ba6-86f1-177776f19865\") " pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc 
kubenswrapper[4565]: I1125 09:17:37.142733 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/80ebb406-0240-4ba6-86f1-177776f19865-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"80ebb406-0240-4ba6-86f1-177776f19865\") " pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.145660 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vhw5\" (UniqueName: \"kubernetes.io/projected/80ebb406-0240-4ba6-86f1-177776f19865-kube-api-access-8vhw5\") pod \"openstack-galera-0\" (UID: \"80ebb406-0240-4ba6-86f1-177776f19865\") " pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.174853 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80ebb406-0240-4ba6-86f1-177776f19865-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"80ebb406-0240-4ba6-86f1-177776f19865\") " pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.208066 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"80ebb406-0240-4ba6-86f1-177776f19865\") " pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.501239 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.920181 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.921993 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.925043 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.925300 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.926184 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.933606 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-cttsd" Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.946698 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 25 09:17:37 crc kubenswrapper[4565]: I1125 09:17:37.992202 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.041749 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/af73acca-d67e-47fe-89ff-70f865731045-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") " pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.041797 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") " pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.041823 4565 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af73acca-d67e-47fe-89ff-70f865731045-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") " pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.041866 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wzr6\" (UniqueName: \"kubernetes.io/projected/af73acca-d67e-47fe-89ff-70f865731045-kube-api-access-5wzr6\") pod \"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") " pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.041882 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/af73acca-d67e-47fe-89ff-70f865731045-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") " pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.041903 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af73acca-d67e-47fe-89ff-70f865731045-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") " pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.041946 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/af73acca-d67e-47fe-89ff-70f865731045-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") " pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 
09:17:38.041990 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/af73acca-d67e-47fe-89ff-70f865731045-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") " pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.143678 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af73acca-d67e-47fe-89ff-70f865731045-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") " pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.143958 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/af73acca-d67e-47fe-89ff-70f865731045-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") " pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.144004 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/af73acca-d67e-47fe-89ff-70f865731045-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") " pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.144051 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/af73acca-d67e-47fe-89ff-70f865731045-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") " pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.144075 4565 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") " pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.144090 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af73acca-d67e-47fe-89ff-70f865731045-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") " pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.144120 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wzr6\" (UniqueName: \"kubernetes.io/projected/af73acca-d67e-47fe-89ff-70f865731045-kube-api-access-5wzr6\") pod \"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") " pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.144136 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/af73acca-d67e-47fe-89ff-70f865731045-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") " pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.144882 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/af73acca-d67e-47fe-89ff-70f865731045-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") " pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.145678 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/af73acca-d67e-47fe-89ff-70f865731045-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") " pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.147569 4565 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.147808 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/af73acca-d67e-47fe-89ff-70f865731045-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") " pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.148329 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af73acca-d67e-47fe-89ff-70f865731045-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") " pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.154576 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af73acca-d67e-47fe-89ff-70f865731045-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") " pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.194185 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/af73acca-d67e-47fe-89ff-70f865731045-galera-tls-certs\") pod 
\"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") " pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.218870 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wzr6\" (UniqueName: \"kubernetes.io/projected/af73acca-d67e-47fe-89ff-70f865731045-kube-api-access-5wzr6\") pod \"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") " pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.241730 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"af73acca-d67e-47fe-89ff-70f865731045\") " pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.261554 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.608294 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.611716 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.616250 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.616426 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-95smf" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.619968 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.622321 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.660715 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/26db83c6-ee58-44da-bcb6-16989b77fba4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"26db83c6-ee58-44da-bcb6-16989b77fba4\") " pod="openstack/memcached-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.660795 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26db83c6-ee58-44da-bcb6-16989b77fba4-config-data\") pod \"memcached-0\" (UID: \"26db83c6-ee58-44da-bcb6-16989b77fba4\") " pod="openstack/memcached-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.660827 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/26db83c6-ee58-44da-bcb6-16989b77fba4-kolla-config\") pod \"memcached-0\" (UID: \"26db83c6-ee58-44da-bcb6-16989b77fba4\") " pod="openstack/memcached-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.660851 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26db83c6-ee58-44da-bcb6-16989b77fba4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"26db83c6-ee58-44da-bcb6-16989b77fba4\") " pod="openstack/memcached-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.660882 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sd85\" (UniqueName: \"kubernetes.io/projected/26db83c6-ee58-44da-bcb6-16989b77fba4-kube-api-access-9sd85\") pod \"memcached-0\" (UID: \"26db83c6-ee58-44da-bcb6-16989b77fba4\") " pod="openstack/memcached-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.762213 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26db83c6-ee58-44da-bcb6-16989b77fba4-config-data\") pod \"memcached-0\" (UID: \"26db83c6-ee58-44da-bcb6-16989b77fba4\") " pod="openstack/memcached-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.762263 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/26db83c6-ee58-44da-bcb6-16989b77fba4-kolla-config\") pod \"memcached-0\" (UID: \"26db83c6-ee58-44da-bcb6-16989b77fba4\") " pod="openstack/memcached-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.762290 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26db83c6-ee58-44da-bcb6-16989b77fba4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"26db83c6-ee58-44da-bcb6-16989b77fba4\") " pod="openstack/memcached-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.762320 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sd85\" (UniqueName: \"kubernetes.io/projected/26db83c6-ee58-44da-bcb6-16989b77fba4-kube-api-access-9sd85\") pod \"memcached-0\" (UID: 
\"26db83c6-ee58-44da-bcb6-16989b77fba4\") " pod="openstack/memcached-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.762421 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/26db83c6-ee58-44da-bcb6-16989b77fba4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"26db83c6-ee58-44da-bcb6-16989b77fba4\") " pod="openstack/memcached-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.763272 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/26db83c6-ee58-44da-bcb6-16989b77fba4-kolla-config\") pod \"memcached-0\" (UID: \"26db83c6-ee58-44da-bcb6-16989b77fba4\") " pod="openstack/memcached-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.763276 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26db83c6-ee58-44da-bcb6-16989b77fba4-config-data\") pod \"memcached-0\" (UID: \"26db83c6-ee58-44da-bcb6-16989b77fba4\") " pod="openstack/memcached-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.767531 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/26db83c6-ee58-44da-bcb6-16989b77fba4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"26db83c6-ee58-44da-bcb6-16989b77fba4\") " pod="openstack/memcached-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.774385 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26db83c6-ee58-44da-bcb6-16989b77fba4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"26db83c6-ee58-44da-bcb6-16989b77fba4\") " pod="openstack/memcached-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.782520 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sd85\" (UniqueName: 
\"kubernetes.io/projected/26db83c6-ee58-44da-bcb6-16989b77fba4-kube-api-access-9sd85\") pod \"memcached-0\" (UID: \"26db83c6-ee58-44da-bcb6-16989b77fba4\") " pod="openstack/memcached-0" Nov 25 09:17:38 crc kubenswrapper[4565]: I1125 09:17:38.951651 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 25 09:17:39 crc kubenswrapper[4565]: I1125 09:17:39.520578 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jzwtn"] Nov 25 09:17:39 crc kubenswrapper[4565]: I1125 09:17:39.522551 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jzwtn" Nov 25 09:17:39 crc kubenswrapper[4565]: I1125 09:17:39.530005 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jzwtn"] Nov 25 09:17:39 crc kubenswrapper[4565]: I1125 09:17:39.579359 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7063b92-d65c-49a6-bf4e-07c4801f8515-utilities\") pod \"community-operators-jzwtn\" (UID: \"a7063b92-d65c-49a6-bf4e-07c4801f8515\") " pod="openshift-marketplace/community-operators-jzwtn" Nov 25 09:17:39 crc kubenswrapper[4565]: I1125 09:17:39.579510 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz26x\" (UniqueName: \"kubernetes.io/projected/a7063b92-d65c-49a6-bf4e-07c4801f8515-kube-api-access-qz26x\") pod \"community-operators-jzwtn\" (UID: \"a7063b92-d65c-49a6-bf4e-07c4801f8515\") " pod="openshift-marketplace/community-operators-jzwtn" Nov 25 09:17:39 crc kubenswrapper[4565]: I1125 09:17:39.579622 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7063b92-d65c-49a6-bf4e-07c4801f8515-catalog-content\") 
pod \"community-operators-jzwtn\" (UID: \"a7063b92-d65c-49a6-bf4e-07c4801f8515\") " pod="openshift-marketplace/community-operators-jzwtn" Nov 25 09:17:39 crc kubenswrapper[4565]: I1125 09:17:39.687437 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7063b92-d65c-49a6-bf4e-07c4801f8515-utilities\") pod \"community-operators-jzwtn\" (UID: \"a7063b92-d65c-49a6-bf4e-07c4801f8515\") " pod="openshift-marketplace/community-operators-jzwtn" Nov 25 09:17:39 crc kubenswrapper[4565]: I1125 09:17:39.687848 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz26x\" (UniqueName: \"kubernetes.io/projected/a7063b92-d65c-49a6-bf4e-07c4801f8515-kube-api-access-qz26x\") pod \"community-operators-jzwtn\" (UID: \"a7063b92-d65c-49a6-bf4e-07c4801f8515\") " pod="openshift-marketplace/community-operators-jzwtn" Nov 25 09:17:39 crc kubenswrapper[4565]: I1125 09:17:39.687890 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7063b92-d65c-49a6-bf4e-07c4801f8515-catalog-content\") pod \"community-operators-jzwtn\" (UID: \"a7063b92-d65c-49a6-bf4e-07c4801f8515\") " pod="openshift-marketplace/community-operators-jzwtn" Nov 25 09:17:39 crc kubenswrapper[4565]: I1125 09:17:39.688315 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7063b92-d65c-49a6-bf4e-07c4801f8515-catalog-content\") pod \"community-operators-jzwtn\" (UID: \"a7063b92-d65c-49a6-bf4e-07c4801f8515\") " pod="openshift-marketplace/community-operators-jzwtn" Nov 25 09:17:39 crc kubenswrapper[4565]: I1125 09:17:39.688546 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7063b92-d65c-49a6-bf4e-07c4801f8515-utilities\") pod \"community-operators-jzwtn\" (UID: 
\"a7063b92-d65c-49a6-bf4e-07c4801f8515\") " pod="openshift-marketplace/community-operators-jzwtn" Nov 25 09:17:39 crc kubenswrapper[4565]: I1125 09:17:39.722333 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz26x\" (UniqueName: \"kubernetes.io/projected/a7063b92-d65c-49a6-bf4e-07c4801f8515-kube-api-access-qz26x\") pod \"community-operators-jzwtn\" (UID: \"a7063b92-d65c-49a6-bf4e-07c4801f8515\") " pod="openshift-marketplace/community-operators-jzwtn" Nov 25 09:17:39 crc kubenswrapper[4565]: I1125 09:17:39.854962 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jzwtn" Nov 25 09:17:40 crc kubenswrapper[4565]: I1125 09:17:40.065642 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 09:17:40 crc kubenswrapper[4565]: I1125 09:17:40.071874 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 25 09:17:40 crc kubenswrapper[4565]: I1125 09:17:40.082292 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-mm6kb" Nov 25 09:17:40 crc kubenswrapper[4565]: I1125 09:17:40.149476 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 09:17:40 crc kubenswrapper[4565]: I1125 09:17:40.196884 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2kcl\" (UniqueName: \"kubernetes.io/projected/0fecdc60-114f-4981-9386-9814aab46033-kube-api-access-n2kcl\") pod \"kube-state-metrics-0\" (UID: \"0fecdc60-114f-4981-9386-9814aab46033\") " pod="openstack/kube-state-metrics-0" Nov 25 09:17:40 crc kubenswrapper[4565]: I1125 09:17:40.299876 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2kcl\" (UniqueName: 
\"kubernetes.io/projected/0fecdc60-114f-4981-9386-9814aab46033-kube-api-access-n2kcl\") pod \"kube-state-metrics-0\" (UID: \"0fecdc60-114f-4981-9386-9814aab46033\") " pod="openstack/kube-state-metrics-0" Nov 25 09:17:40 crc kubenswrapper[4565]: I1125 09:17:40.323603 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2kcl\" (UniqueName: \"kubernetes.io/projected/0fecdc60-114f-4981-9386-9814aab46033-kube-api-access-n2kcl\") pod \"kube-state-metrics-0\" (UID: \"0fecdc60-114f-4981-9386-9814aab46033\") " pod="openstack/kube-state-metrics-0" Nov 25 09:17:40 crc kubenswrapper[4565]: I1125 09:17:40.408353 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.029748 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6qg8j"] Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.032031 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6qg8j"] Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.032038 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6qg8j" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.079011 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"80ebb406-0240-4ba6-86f1-177776f19865","Type":"ContainerStarted","Data":"bf4133afc0ffc2ebd3ca24a1442b112563929d257faa149dfaf40d911d6b5ffc"} Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.099653 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fed31d31-597e-4c84-b8da-77761f891338-catalog-content\") pod \"redhat-marketplace-6qg8j\" (UID: \"fed31d31-597e-4c84-b8da-77761f891338\") " pod="openshift-marketplace/redhat-marketplace-6qg8j" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.099708 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fed31d31-597e-4c84-b8da-77761f891338-utilities\") pod \"redhat-marketplace-6qg8j\" (UID: \"fed31d31-597e-4c84-b8da-77761f891338\") " pod="openshift-marketplace/redhat-marketplace-6qg8j" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.099772 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xcbn\" (UniqueName: \"kubernetes.io/projected/fed31d31-597e-4c84-b8da-77761f891338-kube-api-access-9xcbn\") pod \"redhat-marketplace-6qg8j\" (UID: \"fed31d31-597e-4c84-b8da-77761f891338\") " pod="openshift-marketplace/redhat-marketplace-6qg8j" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.201539 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xcbn\" (UniqueName: \"kubernetes.io/projected/fed31d31-597e-4c84-b8da-77761f891338-kube-api-access-9xcbn\") pod \"redhat-marketplace-6qg8j\" (UID: \"fed31d31-597e-4c84-b8da-77761f891338\") " 
pod="openshift-marketplace/redhat-marketplace-6qg8j" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.201645 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fed31d31-597e-4c84-b8da-77761f891338-catalog-content\") pod \"redhat-marketplace-6qg8j\" (UID: \"fed31d31-597e-4c84-b8da-77761f891338\") " pod="openshift-marketplace/redhat-marketplace-6qg8j" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.201684 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fed31d31-597e-4c84-b8da-77761f891338-utilities\") pod \"redhat-marketplace-6qg8j\" (UID: \"fed31d31-597e-4c84-b8da-77761f891338\") " pod="openshift-marketplace/redhat-marketplace-6qg8j" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.202077 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fed31d31-597e-4c84-b8da-77761f891338-utilities\") pod \"redhat-marketplace-6qg8j\" (UID: \"fed31d31-597e-4c84-b8da-77761f891338\") " pod="openshift-marketplace/redhat-marketplace-6qg8j" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.202555 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fed31d31-597e-4c84-b8da-77761f891338-catalog-content\") pod \"redhat-marketplace-6qg8j\" (UID: \"fed31d31-597e-4c84-b8da-77761f891338\") " pod="openshift-marketplace/redhat-marketplace-6qg8j" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.232139 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xcbn\" (UniqueName: \"kubernetes.io/projected/fed31d31-597e-4c84-b8da-77761f891338-kube-api-access-9xcbn\") pod \"redhat-marketplace-6qg8j\" (UID: \"fed31d31-597e-4c84-b8da-77761f891338\") " pod="openshift-marketplace/redhat-marketplace-6qg8j" Nov 25 
09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.353679 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6qg8j" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.481588 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-7k468"] Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.483532 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7k468" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.486629 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.487083 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-6bz9q" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.487113 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.497884 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7k468"] Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.505575 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/67e2fa61-acc9-415b-9e10-0a35b6a3feb7-var-run\") pod \"ovn-controller-7k468\" (UID: \"67e2fa61-acc9-415b-9e10-0a35b6a3feb7\") " pod="openstack/ovn-controller-7k468" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.505623 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/67e2fa61-acc9-415b-9e10-0a35b6a3feb7-var-run-ovn\") pod \"ovn-controller-7k468\" (UID: \"67e2fa61-acc9-415b-9e10-0a35b6a3feb7\") " pod="openstack/ovn-controller-7k468" Nov 25 09:17:45 crc 
kubenswrapper[4565]: I1125 09:17:45.505639 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/67e2fa61-acc9-415b-9e10-0a35b6a3feb7-var-log-ovn\") pod \"ovn-controller-7k468\" (UID: \"67e2fa61-acc9-415b-9e10-0a35b6a3feb7\") " pod="openstack/ovn-controller-7k468" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.505669 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/67e2fa61-acc9-415b-9e10-0a35b6a3feb7-ovn-controller-tls-certs\") pod \"ovn-controller-7k468\" (UID: \"67e2fa61-acc9-415b-9e10-0a35b6a3feb7\") " pod="openstack/ovn-controller-7k468" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.505690 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85xbl\" (UniqueName: \"kubernetes.io/projected/67e2fa61-acc9-415b-9e10-0a35b6a3feb7-kube-api-access-85xbl\") pod \"ovn-controller-7k468\" (UID: \"67e2fa61-acc9-415b-9e10-0a35b6a3feb7\") " pod="openstack/ovn-controller-7k468" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.505706 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67e2fa61-acc9-415b-9e10-0a35b6a3feb7-combined-ca-bundle\") pod \"ovn-controller-7k468\" (UID: \"67e2fa61-acc9-415b-9e10-0a35b6a3feb7\") " pod="openstack/ovn-controller-7k468" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.505728 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67e2fa61-acc9-415b-9e10-0a35b6a3feb7-scripts\") pod \"ovn-controller-7k468\" (UID: \"67e2fa61-acc9-415b-9e10-0a35b6a3feb7\") " pod="openstack/ovn-controller-7k468" Nov 25 09:17:45 crc kubenswrapper[4565]: 
I1125 09:17:45.538356 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-qhxwx"] Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.539819 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-qhxwx" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.558946 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-qhxwx"] Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.607282 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/67e2fa61-acc9-415b-9e10-0a35b6a3feb7-var-log-ovn\") pod \"ovn-controller-7k468\" (UID: \"67e2fa61-acc9-415b-9e10-0a35b6a3feb7\") " pod="openstack/ovn-controller-7k468" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.607329 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/67e2fa61-acc9-415b-9e10-0a35b6a3feb7-var-run-ovn\") pod \"ovn-controller-7k468\" (UID: \"67e2fa61-acc9-415b-9e10-0a35b6a3feb7\") " pod="openstack/ovn-controller-7k468" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.607358 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/702c3b01-501a-42d1-a945-603af0fbd306-var-lib\") pod \"ovn-controller-ovs-qhxwx\" (UID: \"702c3b01-501a-42d1-a945-603af0fbd306\") " pod="openstack/ovn-controller-ovs-qhxwx" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.607417 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tjtr\" (UniqueName: \"kubernetes.io/projected/702c3b01-501a-42d1-a945-603af0fbd306-kube-api-access-4tjtr\") pod \"ovn-controller-ovs-qhxwx\" (UID: \"702c3b01-501a-42d1-a945-603af0fbd306\") " pod="openstack/ovn-controller-ovs-qhxwx" Nov 25 
09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.607445 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/67e2fa61-acc9-415b-9e10-0a35b6a3feb7-ovn-controller-tls-certs\") pod \"ovn-controller-7k468\" (UID: \"67e2fa61-acc9-415b-9e10-0a35b6a3feb7\") " pod="openstack/ovn-controller-7k468" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.607469 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85xbl\" (UniqueName: \"kubernetes.io/projected/67e2fa61-acc9-415b-9e10-0a35b6a3feb7-kube-api-access-85xbl\") pod \"ovn-controller-7k468\" (UID: \"67e2fa61-acc9-415b-9e10-0a35b6a3feb7\") " pod="openstack/ovn-controller-7k468" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.607487 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67e2fa61-acc9-415b-9e10-0a35b6a3feb7-combined-ca-bundle\") pod \"ovn-controller-7k468\" (UID: \"67e2fa61-acc9-415b-9e10-0a35b6a3feb7\") " pod="openstack/ovn-controller-7k468" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.607520 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67e2fa61-acc9-415b-9e10-0a35b6a3feb7-scripts\") pod \"ovn-controller-7k468\" (UID: \"67e2fa61-acc9-415b-9e10-0a35b6a3feb7\") " pod="openstack/ovn-controller-7k468" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.607550 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/702c3b01-501a-42d1-a945-603af0fbd306-var-run\") pod \"ovn-controller-ovs-qhxwx\" (UID: \"702c3b01-501a-42d1-a945-603af0fbd306\") " pod="openstack/ovn-controller-ovs-qhxwx" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.607581 4565 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/702c3b01-501a-42d1-a945-603af0fbd306-etc-ovs\") pod \"ovn-controller-ovs-qhxwx\" (UID: \"702c3b01-501a-42d1-a945-603af0fbd306\") " pod="openstack/ovn-controller-ovs-qhxwx" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.607619 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/702c3b01-501a-42d1-a945-603af0fbd306-scripts\") pod \"ovn-controller-ovs-qhxwx\" (UID: \"702c3b01-501a-42d1-a945-603af0fbd306\") " pod="openstack/ovn-controller-ovs-qhxwx" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.607692 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/67e2fa61-acc9-415b-9e10-0a35b6a3feb7-var-run\") pod \"ovn-controller-7k468\" (UID: \"67e2fa61-acc9-415b-9e10-0a35b6a3feb7\") " pod="openstack/ovn-controller-7k468" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.607720 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/702c3b01-501a-42d1-a945-603af0fbd306-var-log\") pod \"ovn-controller-ovs-qhxwx\" (UID: \"702c3b01-501a-42d1-a945-603af0fbd306\") " pod="openstack/ovn-controller-ovs-qhxwx" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.608126 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/67e2fa61-acc9-415b-9e10-0a35b6a3feb7-var-log-ovn\") pod \"ovn-controller-7k468\" (UID: \"67e2fa61-acc9-415b-9e10-0a35b6a3feb7\") " pod="openstack/ovn-controller-7k468" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.608186 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/67e2fa61-acc9-415b-9e10-0a35b6a3feb7-var-run-ovn\") pod \"ovn-controller-7k468\" (UID: \"67e2fa61-acc9-415b-9e10-0a35b6a3feb7\") " pod="openstack/ovn-controller-7k468" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.608841 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/67e2fa61-acc9-415b-9e10-0a35b6a3feb7-var-run\") pod \"ovn-controller-7k468\" (UID: \"67e2fa61-acc9-415b-9e10-0a35b6a3feb7\") " pod="openstack/ovn-controller-7k468" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.611978 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67e2fa61-acc9-415b-9e10-0a35b6a3feb7-scripts\") pod \"ovn-controller-7k468\" (UID: \"67e2fa61-acc9-415b-9e10-0a35b6a3feb7\") " pod="openstack/ovn-controller-7k468" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.621303 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85xbl\" (UniqueName: \"kubernetes.io/projected/67e2fa61-acc9-415b-9e10-0a35b6a3feb7-kube-api-access-85xbl\") pod \"ovn-controller-7k468\" (UID: \"67e2fa61-acc9-415b-9e10-0a35b6a3feb7\") " pod="openstack/ovn-controller-7k468" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.622439 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/67e2fa61-acc9-415b-9e10-0a35b6a3feb7-ovn-controller-tls-certs\") pod \"ovn-controller-7k468\" (UID: \"67e2fa61-acc9-415b-9e10-0a35b6a3feb7\") " pod="openstack/ovn-controller-7k468" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.626894 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67e2fa61-acc9-415b-9e10-0a35b6a3feb7-combined-ca-bundle\") pod \"ovn-controller-7k468\" (UID: \"67e2fa61-acc9-415b-9e10-0a35b6a3feb7\") " 
pod="openstack/ovn-controller-7k468" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.709444 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/702c3b01-501a-42d1-a945-603af0fbd306-var-lib\") pod \"ovn-controller-ovs-qhxwx\" (UID: \"702c3b01-501a-42d1-a945-603af0fbd306\") " pod="openstack/ovn-controller-ovs-qhxwx" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.709513 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tjtr\" (UniqueName: \"kubernetes.io/projected/702c3b01-501a-42d1-a945-603af0fbd306-kube-api-access-4tjtr\") pod \"ovn-controller-ovs-qhxwx\" (UID: \"702c3b01-501a-42d1-a945-603af0fbd306\") " pod="openstack/ovn-controller-ovs-qhxwx" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.709572 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/702c3b01-501a-42d1-a945-603af0fbd306-var-run\") pod \"ovn-controller-ovs-qhxwx\" (UID: \"702c3b01-501a-42d1-a945-603af0fbd306\") " pod="openstack/ovn-controller-ovs-qhxwx" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.709599 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/702c3b01-501a-42d1-a945-603af0fbd306-etc-ovs\") pod \"ovn-controller-ovs-qhxwx\" (UID: \"702c3b01-501a-42d1-a945-603af0fbd306\") " pod="openstack/ovn-controller-ovs-qhxwx" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.709631 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/702c3b01-501a-42d1-a945-603af0fbd306-scripts\") pod \"ovn-controller-ovs-qhxwx\" (UID: \"702c3b01-501a-42d1-a945-603af0fbd306\") " pod="openstack/ovn-controller-ovs-qhxwx" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.709687 4565 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/702c3b01-501a-42d1-a945-603af0fbd306-var-lib\") pod \"ovn-controller-ovs-qhxwx\" (UID: \"702c3b01-501a-42d1-a945-603af0fbd306\") " pod="openstack/ovn-controller-ovs-qhxwx" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.709697 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/702c3b01-501a-42d1-a945-603af0fbd306-var-log\") pod \"ovn-controller-ovs-qhxwx\" (UID: \"702c3b01-501a-42d1-a945-603af0fbd306\") " pod="openstack/ovn-controller-ovs-qhxwx" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.709807 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/702c3b01-501a-42d1-a945-603af0fbd306-var-log\") pod \"ovn-controller-ovs-qhxwx\" (UID: \"702c3b01-501a-42d1-a945-603af0fbd306\") " pod="openstack/ovn-controller-ovs-qhxwx" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.709860 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/702c3b01-501a-42d1-a945-603af0fbd306-etc-ovs\") pod \"ovn-controller-ovs-qhxwx\" (UID: \"702c3b01-501a-42d1-a945-603af0fbd306\") " pod="openstack/ovn-controller-ovs-qhxwx" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.709881 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/702c3b01-501a-42d1-a945-603af0fbd306-var-run\") pod \"ovn-controller-ovs-qhxwx\" (UID: \"702c3b01-501a-42d1-a945-603af0fbd306\") " pod="openstack/ovn-controller-ovs-qhxwx" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.712171 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/702c3b01-501a-42d1-a945-603af0fbd306-scripts\") pod \"ovn-controller-ovs-qhxwx\" (UID: 
\"702c3b01-501a-42d1-a945-603af0fbd306\") " pod="openstack/ovn-controller-ovs-qhxwx" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.726502 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tjtr\" (UniqueName: \"kubernetes.io/projected/702c3b01-501a-42d1-a945-603af0fbd306-kube-api-access-4tjtr\") pod \"ovn-controller-ovs-qhxwx\" (UID: \"702c3b01-501a-42d1-a945-603af0fbd306\") " pod="openstack/ovn-controller-ovs-qhxwx" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.803515 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7k468" Nov 25 09:17:45 crc kubenswrapper[4565]: I1125 09:17:45.853539 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-qhxwx" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.382585 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.388866 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.389270 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.391145 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.394398 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-jxwmh" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.394541 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.394625 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.395660 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.436796 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/164519a3-6eaf-49ac-bc20-cd1a4b04d594-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") " pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.436866 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/164519a3-6eaf-49ac-bc20-cd1a4b04d594-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") " pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.436894 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz7rw\" (UniqueName: \"kubernetes.io/projected/164519a3-6eaf-49ac-bc20-cd1a4b04d594-kube-api-access-jz7rw\") pod \"ovsdbserver-sb-0\" (UID: 
\"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") " pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.436991 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") " pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.437010 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/164519a3-6eaf-49ac-bc20-cd1a4b04d594-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") " pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.437098 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/164519a3-6eaf-49ac-bc20-cd1a4b04d594-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") " pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.437126 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/164519a3-6eaf-49ac-bc20-cd1a4b04d594-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") " pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.437157 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/164519a3-6eaf-49ac-bc20-cd1a4b04d594-config\") pod \"ovsdbserver-sb-0\" (UID: \"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") " pod="openstack/ovsdbserver-sb-0" Nov 
25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.538559 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/164519a3-6eaf-49ac-bc20-cd1a4b04d594-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") " pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.538613 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/164519a3-6eaf-49ac-bc20-cd1a4b04d594-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") " pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.538642 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz7rw\" (UniqueName: \"kubernetes.io/projected/164519a3-6eaf-49ac-bc20-cd1a4b04d594-kube-api-access-jz7rw\") pod \"ovsdbserver-sb-0\" (UID: \"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") " pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.538698 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") " pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.538715 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/164519a3-6eaf-49ac-bc20-cd1a4b04d594-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") " pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.538777 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/164519a3-6eaf-49ac-bc20-cd1a4b04d594-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") " pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.538795 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/164519a3-6eaf-49ac-bc20-cd1a4b04d594-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") " pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.538818 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/164519a3-6eaf-49ac-bc20-cd1a4b04d594-config\") pod \"ovsdbserver-sb-0\" (UID: \"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") " pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.540918 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/164519a3-6eaf-49ac-bc20-cd1a4b04d594-config\") pod \"ovsdbserver-sb-0\" (UID: \"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") " pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.541442 4565 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.541514 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/164519a3-6eaf-49ac-bc20-cd1a4b04d594-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") " pod="openstack/ovsdbserver-sb-0" Nov 25 
09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.542442 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/164519a3-6eaf-49ac-bc20-cd1a4b04d594-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") " pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.566260 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz7rw\" (UniqueName: \"kubernetes.io/projected/164519a3-6eaf-49ac-bc20-cd1a4b04d594-kube-api-access-jz7rw\") pod \"ovsdbserver-sb-0\" (UID: \"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") " pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.566463 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/164519a3-6eaf-49ac-bc20-cd1a4b04d594-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") " pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.566745 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/164519a3-6eaf-49ac-bc20-cd1a4b04d594-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") " pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.574467 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/164519a3-6eaf-49ac-bc20-cd1a4b04d594-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") " pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.584006 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 25 09:17:47 crc 
kubenswrapper[4565]: I1125 09:17:47.585501 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.588822 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.589271 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.589861 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-sd5bm" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.590057 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.591232 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"164519a3-6eaf-49ac-bc20-cd1a4b04d594\") " pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.614915 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.717913 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.744704 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68861e53-c198-4971-baf5-dd1653ef84ad-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"68861e53-c198-4971-baf5-dd1653ef84ad\") " pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.744789 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68861e53-c198-4971-baf5-dd1653ef84ad-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"68861e53-c198-4971-baf5-dd1653ef84ad\") " pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.744860 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/68861e53-c198-4971-baf5-dd1653ef84ad-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"68861e53-c198-4971-baf5-dd1653ef84ad\") " pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.745091 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"68861e53-c198-4971-baf5-dd1653ef84ad\") " pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.745157 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpj2v\" (UniqueName: \"kubernetes.io/projected/68861e53-c198-4971-baf5-dd1653ef84ad-kube-api-access-hpj2v\") pod \"ovsdbserver-nb-0\" (UID: \"68861e53-c198-4971-baf5-dd1653ef84ad\") " pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc 
kubenswrapper[4565]: I1125 09:17:47.745317 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/68861e53-c198-4971-baf5-dd1653ef84ad-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"68861e53-c198-4971-baf5-dd1653ef84ad\") " pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.745358 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68861e53-c198-4971-baf5-dd1653ef84ad-config\") pod \"ovsdbserver-nb-0\" (UID: \"68861e53-c198-4971-baf5-dd1653ef84ad\") " pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.745677 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/68861e53-c198-4971-baf5-dd1653ef84ad-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"68861e53-c198-4971-baf5-dd1653ef84ad\") " pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.847694 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68861e53-c198-4971-baf5-dd1653ef84ad-config\") pod \"ovsdbserver-nb-0\" (UID: \"68861e53-c198-4971-baf5-dd1653ef84ad\") " pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.847775 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/68861e53-c198-4971-baf5-dd1653ef84ad-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"68861e53-c198-4971-baf5-dd1653ef84ad\") " pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.848089 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68861e53-c198-4971-baf5-dd1653ef84ad-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"68861e53-c198-4971-baf5-dd1653ef84ad\") " pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.848120 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68861e53-c198-4971-baf5-dd1653ef84ad-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"68861e53-c198-4971-baf5-dd1653ef84ad\") " pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.848151 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/68861e53-c198-4971-baf5-dd1653ef84ad-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"68861e53-c198-4971-baf5-dd1653ef84ad\") " pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.848178 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"68861e53-c198-4971-baf5-dd1653ef84ad\") " pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.848220 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpj2v\" (UniqueName: \"kubernetes.io/projected/68861e53-c198-4971-baf5-dd1653ef84ad-kube-api-access-hpj2v\") pod \"ovsdbserver-nb-0\" (UID: \"68861e53-c198-4971-baf5-dd1653ef84ad\") " pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.848306 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/68861e53-c198-4971-baf5-dd1653ef84ad-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: 
\"68861e53-c198-4971-baf5-dd1653ef84ad\") " pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.848750 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/68861e53-c198-4971-baf5-dd1653ef84ad-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"68861e53-c198-4971-baf5-dd1653ef84ad\") " pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.849872 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68861e53-c198-4971-baf5-dd1653ef84ad-config\") pod \"ovsdbserver-nb-0\" (UID: \"68861e53-c198-4971-baf5-dd1653ef84ad\") " pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.851083 4565 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"68861e53-c198-4971-baf5-dd1653ef84ad\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.851893 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68861e53-c198-4971-baf5-dd1653ef84ad-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"68861e53-c198-4971-baf5-dd1653ef84ad\") " pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.854304 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/68861e53-c198-4971-baf5-dd1653ef84ad-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"68861e53-c198-4971-baf5-dd1653ef84ad\") " pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.856306 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/68861e53-c198-4971-baf5-dd1653ef84ad-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"68861e53-c198-4971-baf5-dd1653ef84ad\") " pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.865070 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpj2v\" (UniqueName: \"kubernetes.io/projected/68861e53-c198-4971-baf5-dd1653ef84ad-kube-api-access-hpj2v\") pod \"ovsdbserver-nb-0\" (UID: \"68861e53-c198-4971-baf5-dd1653ef84ad\") " pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.866692 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68861e53-c198-4971-baf5-dd1653ef84ad-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"68861e53-c198-4971-baf5-dd1653ef84ad\") " pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.872760 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"68861e53-c198-4971-baf5-dd1653ef84ad\") " pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:47 crc kubenswrapper[4565]: I1125 09:17:47.939092 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 25 09:17:49 crc kubenswrapper[4565]: I1125 09:17:49.476713 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 25 09:17:53 crc kubenswrapper[4565]: W1125 09:17:53.174892 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf73acca_d67e_47fe_89ff_70f865731045.slice/crio-a164fcde91e7092ac8ec0fd85ac133492f077bdeb4e55c8c937a3532ecb2eb0e WatchSource:0}: Error finding container a164fcde91e7092ac8ec0fd85ac133492f077bdeb4e55c8c937a3532ecb2eb0e: Status 404 returned error can't find the container with id a164fcde91e7092ac8ec0fd85ac133492f077bdeb4e55c8c937a3532ecb2eb0e Nov 25 09:17:53 crc kubenswrapper[4565]: I1125 09:17:53.536763 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jzwtn"] Nov 25 09:17:54 crc kubenswrapper[4565]: I1125 09:17:54.177871 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"af73acca-d67e-47fe-89ff-70f865731045","Type":"ContainerStarted","Data":"a164fcde91e7092ac8ec0fd85ac133492f077bdeb4e55c8c937a3532ecb2eb0e"} Nov 25 09:17:55 crc kubenswrapper[4565]: E1125 09:17:55.743668 4565 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba" Nov 25 09:17:55 crc kubenswrapper[4565]: E1125 09:17:55.743815 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground 
--log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cnltl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6584b49599-jmrd7_openstack(53d3ef3f-d96c-46ad-b34a-bb9503143f25): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Nov 25 09:17:55 crc kubenswrapper[4565]: E1125 09:17:55.745184 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6584b49599-jmrd7" podUID="53d3ef3f-d96c-46ad-b34a-bb9503143f25" Nov 25 09:17:55 crc kubenswrapper[4565]: E1125 09:17:55.769768 4565 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba" Nov 25 09:17:55 crc kubenswrapper[4565]: E1125 09:17:55.769890 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4gmpg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6486446b9f-4p45g_openstack(f98fcb7a-0761-41fb-b312-3d7188057efc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 09:17:55 crc kubenswrapper[4565]: E1125 09:17:55.773460 4565 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6486446b9f-4p45g" podUID="f98fcb7a-0761-41fb-b312-3d7188057efc" Nov 25 09:17:55 crc kubenswrapper[4565]: E1125 09:17:55.782673 4565 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba" Nov 25 09:17:55 crc kubenswrapper[4565]: E1125 09:17:55.782879 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t5xr8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7bdd77c89-w4clf_openstack(2a2141ea-42ca-43de-9345-c01631407a93): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 09:17:55 crc kubenswrapper[4565]: E1125 09:17:55.787986 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-7bdd77c89-w4clf" podUID="2a2141ea-42ca-43de-9345-c01631407a93" Nov 25 09:17:55 crc kubenswrapper[4565]: E1125 09:17:55.825051 4565 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba" Nov 25 09:17:55 crc kubenswrapper[4565]: E1125 09:17:55.825324 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ngmnv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},
},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6d8746976c-gdbcm_openstack(56879c95-6643-4472-993d-41fc2b340dc1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 09:17:55 crc kubenswrapper[4565]: E1125 09:17:55.827598 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6d8746976c-gdbcm" podUID="56879c95-6643-4472-993d-41fc2b340dc1" Nov 25 09:17:56 crc kubenswrapper[4565]: I1125 09:17:56.204442 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"80ebb406-0240-4ba6-86f1-177776f19865","Type":"ContainerStarted","Data":"1542bf44a94f22695a70c2fea40a3eecffc2a3d043a72958833ec1ac2bb96695"} Nov 25 09:17:56 crc kubenswrapper[4565]: I1125 09:17:56.210302 4565 generic.go:334] "Generic (PLEG): container finished" podID="a7063b92-d65c-49a6-bf4e-07c4801f8515" containerID="5d4b03297db44571b447843ab1d227d854256912afec01fb1b2bcf873a733ca5" exitCode=0 Nov 25 09:17:56 crc kubenswrapper[4565]: I1125 09:17:56.210384 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzwtn" 
event={"ID":"a7063b92-d65c-49a6-bf4e-07c4801f8515","Type":"ContainerDied","Data":"5d4b03297db44571b447843ab1d227d854256912afec01fb1b2bcf873a733ca5"} Nov 25 09:17:56 crc kubenswrapper[4565]: I1125 09:17:56.210407 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzwtn" event={"ID":"a7063b92-d65c-49a6-bf4e-07c4801f8515","Type":"ContainerStarted","Data":"11c87bf7eccaa9f59b37d9c2476608f92a7d6119d1c2341e06989933be89d611"} Nov 25 09:17:56 crc kubenswrapper[4565]: I1125 09:17:56.214649 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"af73acca-d67e-47fe-89ff-70f865731045","Type":"ContainerStarted","Data":"6565b032402e074db90c08871fd1b4afcb503dd614c8cf403368cf6b9f1caa94"} Nov 25 09:17:56 crc kubenswrapper[4565]: E1125 09:17:56.216618 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba\\\"\"" pod="openstack/dnsmasq-dns-6d8746976c-gdbcm" podUID="56879c95-6643-4472-993d-41fc2b340dc1" Nov 25 09:17:56 crc kubenswrapper[4565]: E1125 09:17:56.217416 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba\\\"\"" pod="openstack/dnsmasq-dns-6486446b9f-4p45g" podUID="f98fcb7a-0761-41fb-b312-3d7188057efc" Nov 25 09:17:56 crc kubenswrapper[4565]: W1125 09:17:56.426233 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26db83c6_ee58_44da_bcb6_16989b77fba4.slice/crio-5da86a7ecc5c4c0942b738a73b284353faaa54f80db117a15e8959de249ade66 
WatchSource:0}: Error finding container 5da86a7ecc5c4c0942b738a73b284353faaa54f80db117a15e8959de249ade66: Status 404 returned error can't find the container with id 5da86a7ecc5c4c0942b738a73b284353faaa54f80db117a15e8959de249ade66 Nov 25 09:17:56 crc kubenswrapper[4565]: I1125 09:17:56.433240 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 25 09:17:56 crc kubenswrapper[4565]: I1125 09:17:56.519686 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7k468"] Nov 25 09:17:56 crc kubenswrapper[4565]: I1125 09:17:56.522691 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 09:17:56 crc kubenswrapper[4565]: I1125 09:17:56.708164 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-qhxwx"] Nov 25 09:17:56 crc kubenswrapper[4565]: I1125 09:17:56.810753 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 25 09:17:56 crc kubenswrapper[4565]: W1125 09:17:56.817906 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod702c3b01_501a_42d1_a945_603af0fbd306.slice/crio-9732622f982d7d408bbca35b2096caed48efe75f91fc14989523a751df46cb6c WatchSource:0}: Error finding container 9732622f982d7d408bbca35b2096caed48efe75f91fc14989523a751df46cb6c: Status 404 returned error can't find the container with id 9732622f982d7d408bbca35b2096caed48efe75f91fc14989523a751df46cb6c Nov 25 09:17:56 crc kubenswrapper[4565]: I1125 09:17:56.853438 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6qg8j"] Nov 25 09:17:56 crc kubenswrapper[4565]: W1125 09:17:56.907084 4565 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfed31d31_597e_4c84_b8da_77761f891338.slice/crio-bb4a0202440f277cea09693879aee74a93c816e95cfef22b03275b185fec91c2 WatchSource:0}: Error finding container bb4a0202440f277cea09693879aee74a93c816e95cfef22b03275b185fec91c2: Status 404 returned error can't find the container with id bb4a0202440f277cea09693879aee74a93c816e95cfef22b03275b185fec91c2 Nov 25 09:17:56 crc kubenswrapper[4565]: I1125 09:17:56.953886 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bdd77c89-w4clf" Nov 25 09:17:56 crc kubenswrapper[4565]: I1125 09:17:56.964950 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6584b49599-jmrd7" Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.081515 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnltl\" (UniqueName: \"kubernetes.io/projected/53d3ef3f-d96c-46ad-b34a-bb9503143f25-kube-api-access-cnltl\") pod \"53d3ef3f-d96c-46ad-b34a-bb9503143f25\" (UID: \"53d3ef3f-d96c-46ad-b34a-bb9503143f25\") " Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.081635 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a2141ea-42ca-43de-9345-c01631407a93-config\") pod \"2a2141ea-42ca-43de-9345-c01631407a93\" (UID: \"2a2141ea-42ca-43de-9345-c01631407a93\") " Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.081685 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53d3ef3f-d96c-46ad-b34a-bb9503143f25-dns-svc\") pod \"53d3ef3f-d96c-46ad-b34a-bb9503143f25\" (UID: \"53d3ef3f-d96c-46ad-b34a-bb9503143f25\") " Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.081751 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/53d3ef3f-d96c-46ad-b34a-bb9503143f25-config\") pod \"53d3ef3f-d96c-46ad-b34a-bb9503143f25\" (UID: \"53d3ef3f-d96c-46ad-b34a-bb9503143f25\") " Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.081780 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5xr8\" (UniqueName: \"kubernetes.io/projected/2a2141ea-42ca-43de-9345-c01631407a93-kube-api-access-t5xr8\") pod \"2a2141ea-42ca-43de-9345-c01631407a93\" (UID: \"2a2141ea-42ca-43de-9345-c01631407a93\") " Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.082653 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a2141ea-42ca-43de-9345-c01631407a93-config" (OuterVolumeSpecName: "config") pod "2a2141ea-42ca-43de-9345-c01631407a93" (UID: "2a2141ea-42ca-43de-9345-c01631407a93"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.082759 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53d3ef3f-d96c-46ad-b34a-bb9503143f25-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "53d3ef3f-d96c-46ad-b34a-bb9503143f25" (UID: "53d3ef3f-d96c-46ad-b34a-bb9503143f25"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.083316 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53d3ef3f-d96c-46ad-b34a-bb9503143f25-config" (OuterVolumeSpecName: "config") pod "53d3ef3f-d96c-46ad-b34a-bb9503143f25" (UID: "53d3ef3f-d96c-46ad-b34a-bb9503143f25"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.088876 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a2141ea-42ca-43de-9345-c01631407a93-kube-api-access-t5xr8" (OuterVolumeSpecName: "kube-api-access-t5xr8") pod "2a2141ea-42ca-43de-9345-c01631407a93" (UID: "2a2141ea-42ca-43de-9345-c01631407a93"). InnerVolumeSpecName "kube-api-access-t5xr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.089184 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53d3ef3f-d96c-46ad-b34a-bb9503143f25-kube-api-access-cnltl" (OuterVolumeSpecName: "kube-api-access-cnltl") pod "53d3ef3f-d96c-46ad-b34a-bb9503143f25" (UID: "53d3ef3f-d96c-46ad-b34a-bb9503143f25"). InnerVolumeSpecName "kube-api-access-cnltl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.186293 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d3ef3f-d96c-46ad-b34a-bb9503143f25-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.187206 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5xr8\" (UniqueName: \"kubernetes.io/projected/2a2141ea-42ca-43de-9345-c01631407a93-kube-api-access-t5xr8\") on node \"crc\" DevicePath \"\"" Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.187344 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnltl\" (UniqueName: \"kubernetes.io/projected/53d3ef3f-d96c-46ad-b34a-bb9503143f25-kube-api-access-cnltl\") on node \"crc\" DevicePath \"\"" Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.187632 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2a2141ea-42ca-43de-9345-c01631407a93-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.188457 4565 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53d3ef3f-d96c-46ad-b34a-bb9503143f25-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.225242 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7k468" event={"ID":"67e2fa61-acc9-415b-9e10-0a35b6a3feb7","Type":"ContainerStarted","Data":"b2903dc0350b44f0ff2b5f4a1de41407d10ae51adbac66c7e53ff8aeb69ae788"} Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.227236 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzwtn" event={"ID":"a7063b92-d65c-49a6-bf4e-07c4801f8515","Type":"ContainerStarted","Data":"d29edf7d165e35cfcbf2df46ec4274f48c46c1a750a91dc8be58f0d98faad7cf"} Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.242859 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b0cc10ca-7483-447d-a1ed-1566c994efdc","Type":"ContainerStarted","Data":"d94652e304634ec33bfb162b4c2b317c7742bd20ae9567f935e47470498b93ac"} Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.245792 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"46428d34-ed8b-4584-954a-0c51d96b1c9c","Type":"ContainerStarted","Data":"e4896b199da22c02f746a769e2a48931780200f33e23e7479d74075df91ee76c"} Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.248062 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"68861e53-c198-4971-baf5-dd1653ef84ad","Type":"ContainerStarted","Data":"511ea6a06b4964017ee67d43954c4974dc621e3fc896d33aac0d94a8b6419f5b"} Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.249139 4565 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"26db83c6-ee58-44da-bcb6-16989b77fba4","Type":"ContainerStarted","Data":"5da86a7ecc5c4c0942b738a73b284353faaa54f80db117a15e8959de249ade66"} Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.256904 4565 generic.go:334] "Generic (PLEG): container finished" podID="fed31d31-597e-4c84-b8da-77761f891338" containerID="f0f31dacaca13a52d5751f5c0fd50e1f6777409f0815f45eebcae50d4e355c79" exitCode=0 Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.257013 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qg8j" event={"ID":"fed31d31-597e-4c84-b8da-77761f891338","Type":"ContainerDied","Data":"f0f31dacaca13a52d5751f5c0fd50e1f6777409f0815f45eebcae50d4e355c79"} Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.257040 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qg8j" event={"ID":"fed31d31-597e-4c84-b8da-77761f891338","Type":"ContainerStarted","Data":"bb4a0202440f277cea09693879aee74a93c816e95cfef22b03275b185fec91c2"} Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.261091 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0fecdc60-114f-4981-9386-9814aab46033","Type":"ContainerStarted","Data":"283b70f639a67c8298e3cc68d96f59b461e30f842b9f65b52ae89c0002644e24"} Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.264255 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdd77c89-w4clf" event={"ID":"2a2141ea-42ca-43de-9345-c01631407a93","Type":"ContainerDied","Data":"ee2911e0c9958bf3a4e6216c65dc2705733b287a0bde4ad4a3dc3b11ee4eb35b"} Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.264291 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bdd77c89-w4clf" Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.268623 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qhxwx" event={"ID":"702c3b01-501a-42d1-a945-603af0fbd306","Type":"ContainerStarted","Data":"9732622f982d7d408bbca35b2096caed48efe75f91fc14989523a751df46cb6c"} Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.270749 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6584b49599-jmrd7" event={"ID":"53d3ef3f-d96c-46ad-b34a-bb9503143f25","Type":"ContainerDied","Data":"f5d86d8486c6b0ae350edebcfe896e94644bc154a236e01c9683f6aa31288307"} Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.270788 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6584b49599-jmrd7" Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.377944 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bdd77c89-w4clf"] Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.410792 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bdd77c89-w4clf"] Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.455850 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6584b49599-jmrd7"] Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.464531 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6584b49599-jmrd7"] Nov 25 09:17:57 crc kubenswrapper[4565]: I1125 09:17:57.660809 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 25 09:17:58 crc kubenswrapper[4565]: I1125 09:17:58.281823 4565 generic.go:334] "Generic (PLEG): container finished" podID="a7063b92-d65c-49a6-bf4e-07c4801f8515" containerID="d29edf7d165e35cfcbf2df46ec4274f48c46c1a750a91dc8be58f0d98faad7cf" exitCode=0 Nov 25 09:17:58 crc kubenswrapper[4565]: I1125 
09:17:58.281891 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzwtn" event={"ID":"a7063b92-d65c-49a6-bf4e-07c4801f8515","Type":"ContainerDied","Data":"d29edf7d165e35cfcbf2df46ec4274f48c46c1a750a91dc8be58f0d98faad7cf"} Nov 25 09:17:58 crc kubenswrapper[4565]: I1125 09:17:58.287707 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"164519a3-6eaf-49ac-bc20-cd1a4b04d594","Type":"ContainerStarted","Data":"87290bec88edbf4d1a4f829412f44d8f28856e632b443e740a853d4dbd7747b0"} Nov 25 09:17:59 crc kubenswrapper[4565]: I1125 09:17:59.107185 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a2141ea-42ca-43de-9345-c01631407a93" path="/var/lib/kubelet/pods/2a2141ea-42ca-43de-9345-c01631407a93/volumes" Nov 25 09:17:59 crc kubenswrapper[4565]: I1125 09:17:59.108240 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53d3ef3f-d96c-46ad-b34a-bb9503143f25" path="/var/lib/kubelet/pods/53d3ef3f-d96c-46ad-b34a-bb9503143f25/volumes" Nov 25 09:18:00 crc kubenswrapper[4565]: I1125 09:18:00.312282 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0fecdc60-114f-4981-9386-9814aab46033","Type":"ContainerStarted","Data":"fbbeb8419880a1d965c8d8dcb6b2db9863b40fdc3f6fce3eaa7873977ab7bc37"} Nov 25 09:18:00 crc kubenswrapper[4565]: I1125 09:18:00.312793 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 25 09:18:00 crc kubenswrapper[4565]: I1125 09:18:00.314659 4565 generic.go:334] "Generic (PLEG): container finished" podID="80ebb406-0240-4ba6-86f1-177776f19865" containerID="1542bf44a94f22695a70c2fea40a3eecffc2a3d043a72958833ec1ac2bb96695" exitCode=0 Nov 25 09:18:00 crc kubenswrapper[4565]: I1125 09:18:00.314738 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"80ebb406-0240-4ba6-86f1-177776f19865","Type":"ContainerDied","Data":"1542bf44a94f22695a70c2fea40a3eecffc2a3d043a72958833ec1ac2bb96695"} Nov 25 09:18:00 crc kubenswrapper[4565]: I1125 09:18:00.328343 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzwtn" event={"ID":"a7063b92-d65c-49a6-bf4e-07c4801f8515","Type":"ContainerStarted","Data":"61ca34293785bb969c66b72860c09a880678a6ab113217ae00dcfc5f62502746"} Nov 25 09:18:00 crc kubenswrapper[4565]: I1125 09:18:00.330922 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"26db83c6-ee58-44da-bcb6-16989b77fba4","Type":"ContainerStarted","Data":"b6b5c35a1c1e4d014467ff9c3b21f91ac95ded7927ad4ddc6b977a50fdc8397a"} Nov 25 09:18:00 crc kubenswrapper[4565]: I1125 09:18:00.331244 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Nov 25 09:18:00 crc kubenswrapper[4565]: I1125 09:18:00.333400 4565 generic.go:334] "Generic (PLEG): container finished" podID="fed31d31-597e-4c84-b8da-77761f891338" containerID="b2870d1d451620b0553ff8954551ddd7c3c7c97d05131c140c66e15dda26201f" exitCode=0 Nov 25 09:18:00 crc kubenswrapper[4565]: I1125 09:18:00.333443 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qg8j" event={"ID":"fed31d31-597e-4c84-b8da-77761f891338","Type":"ContainerDied","Data":"b2870d1d451620b0553ff8954551ddd7c3c7c97d05131c140c66e15dda26201f"} Nov 25 09:18:00 crc kubenswrapper[4565]: I1125 09:18:00.347541 4565 generic.go:334] "Generic (PLEG): container finished" podID="af73acca-d67e-47fe-89ff-70f865731045" containerID="6565b032402e074db90c08871fd1b4afcb503dd614c8cf403368cf6b9f1caa94" exitCode=0 Nov 25 09:18:00 crc kubenswrapper[4565]: I1125 09:18:00.347589 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"af73acca-d67e-47fe-89ff-70f865731045","Type":"ContainerDied","Data":"6565b032402e074db90c08871fd1b4afcb503dd614c8cf403368cf6b9f1caa94"} Nov 25 09:18:00 crc kubenswrapper[4565]: I1125 09:18:00.359350 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=17.404237294 podStartE2EDuration="20.359270875s" podCreationTimestamp="2025-11-25 09:17:40 +0000 UTC" firstStartedPulling="2025-11-25 09:17:56.608155562 +0000 UTC m=+809.810650700" lastFinishedPulling="2025-11-25 09:17:59.563189142 +0000 UTC m=+812.765684281" observedRunningTime="2025-11-25 09:18:00.335793333 +0000 UTC m=+813.538288472" watchObservedRunningTime="2025-11-25 09:18:00.359270875 +0000 UTC m=+813.561766013" Nov 25 09:18:00 crc kubenswrapper[4565]: I1125 09:18:00.394302 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=19.289453858 podStartE2EDuration="22.394286147s" podCreationTimestamp="2025-11-25 09:17:38 +0000 UTC" firstStartedPulling="2025-11-25 09:17:56.458210637 +0000 UTC m=+809.660705775" lastFinishedPulling="2025-11-25 09:17:59.563042937 +0000 UTC m=+812.765538064" observedRunningTime="2025-11-25 09:18:00.386439474 +0000 UTC m=+813.588934612" watchObservedRunningTime="2025-11-25 09:18:00.394286147 +0000 UTC m=+813.596781285" Nov 25 09:18:00 crc kubenswrapper[4565]: I1125 09:18:00.406548 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jzwtn" podStartSLOduration=18.05485555 podStartE2EDuration="21.406528929s" podCreationTimestamp="2025-11-25 09:17:39 +0000 UTC" firstStartedPulling="2025-11-25 09:17:56.211947847 +0000 UTC m=+809.414442975" lastFinishedPulling="2025-11-25 09:17:59.563621217 +0000 UTC m=+812.766116354" observedRunningTime="2025-11-25 09:18:00.400396025 +0000 UTC m=+813.602891164" watchObservedRunningTime="2025-11-25 09:18:00.406528929 +0000 UTC 
m=+813.609024067" Nov 25 09:18:03 crc kubenswrapper[4565]: I1125 09:18:03.375046 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"af73acca-d67e-47fe-89ff-70f865731045","Type":"ContainerStarted","Data":"6a9ff668c06cd5a69cb310b97b8c392767728858d2cf7bbd80e4678a562009b0"} Nov 25 09:18:03 crc kubenswrapper[4565]: I1125 09:18:03.377170 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"80ebb406-0240-4ba6-86f1-177776f19865","Type":"ContainerStarted","Data":"463dae8b8f13748226499a5046ecb6b5ce960d708664397a8b63e4a37f5b95f2"} Nov 25 09:18:03 crc kubenswrapper[4565]: I1125 09:18:03.395853 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=24.736718564 podStartE2EDuration="27.395842477s" podCreationTimestamp="2025-11-25 09:17:36 +0000 UTC" firstStartedPulling="2025-11-25 09:17:53.180740309 +0000 UTC m=+806.383235447" lastFinishedPulling="2025-11-25 09:17:55.839864222 +0000 UTC m=+809.042359360" observedRunningTime="2025-11-25 09:18:03.392833033 +0000 UTC m=+816.595328172" watchObservedRunningTime="2025-11-25 09:18:03.395842477 +0000 UTC m=+816.598337614" Nov 25 09:18:03 crc kubenswrapper[4565]: I1125 09:18:03.411394 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=17.29509603 podStartE2EDuration="28.411386642s" podCreationTimestamp="2025-11-25 09:17:35 +0000 UTC" firstStartedPulling="2025-11-25 09:17:44.691431644 +0000 UTC m=+797.893926783" lastFinishedPulling="2025-11-25 09:17:55.807722257 +0000 UTC m=+809.010217395" observedRunningTime="2025-11-25 09:18:03.410597194 +0000 UTC m=+816.613092332" watchObservedRunningTime="2025-11-25 09:18:03.411386642 +0000 UTC m=+816.613881780" Nov 25 09:18:05 crc kubenswrapper[4565]: I1125 09:18:05.405613 4565 generic.go:334] "Generic (PLEG): container finished" 
podID="702c3b01-501a-42d1-a945-603af0fbd306" containerID="f0b0b12516f8fb8020ba9fdfc3c610b3cecaa95a3345603471888116d2bbb356" exitCode=0 Nov 25 09:18:05 crc kubenswrapper[4565]: I1125 09:18:05.406372 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qhxwx" event={"ID":"702c3b01-501a-42d1-a945-603af0fbd306","Type":"ContainerDied","Data":"f0b0b12516f8fb8020ba9fdfc3c610b3cecaa95a3345603471888116d2bbb356"} Nov 25 09:18:05 crc kubenswrapper[4565]: I1125 09:18:05.419893 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qg8j" event={"ID":"fed31d31-597e-4c84-b8da-77761f891338","Type":"ContainerStarted","Data":"aabba421b0fef2cf8f36f5650994b775062e4a02775156d49f92a417399f415f"} Nov 25 09:18:05 crc kubenswrapper[4565]: I1125 09:18:05.422475 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7k468" event={"ID":"67e2fa61-acc9-415b-9e10-0a35b6a3feb7","Type":"ContainerStarted","Data":"163f8865e3c1961909bf150bb075133038d46162aa2949e130811177a6b454ca"} Nov 25 09:18:05 crc kubenswrapper[4565]: I1125 09:18:05.422686 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-7k468" Nov 25 09:18:05 crc kubenswrapper[4565]: I1125 09:18:05.434441 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"164519a3-6eaf-49ac-bc20-cd1a4b04d594","Type":"ContainerStarted","Data":"8ab5156389035483cf4018e0f8d765df603847f147016f595ad610776ce5c6e8"} Nov 25 09:18:05 crc kubenswrapper[4565]: I1125 09:18:05.442583 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"68861e53-c198-4971-baf5-dd1653ef84ad","Type":"ContainerStarted","Data":"c3c1d2d8e2020086b772e7043e85e26701bfff098d750d944f91b4683fe7e3ba"} Nov 25 09:18:05 crc kubenswrapper[4565]: I1125 09:18:05.459213 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-6qg8j" podStartSLOduration=14.614521558 podStartE2EDuration="21.459194405s" podCreationTimestamp="2025-11-25 09:17:44 +0000 UTC" firstStartedPulling="2025-11-25 09:17:57.413477534 +0000 UTC m=+810.615972673" lastFinishedPulling="2025-11-25 09:18:04.258150382 +0000 UTC m=+817.460645520" observedRunningTime="2025-11-25 09:18:05.453899843 +0000 UTC m=+818.656395002" watchObservedRunningTime="2025-11-25 09:18:05.459194405 +0000 UTC m=+818.661689544" Nov 25 09:18:05 crc kubenswrapper[4565]: I1125 09:18:05.478217 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-7k468" podStartSLOduration=12.81945057 podStartE2EDuration="20.478206377s" podCreationTimestamp="2025-11-25 09:17:45 +0000 UTC" firstStartedPulling="2025-11-25 09:17:56.600011107 +0000 UTC m=+809.802506236" lastFinishedPulling="2025-11-25 09:18:04.258766905 +0000 UTC m=+817.461262043" observedRunningTime="2025-11-25 09:18:05.473084912 +0000 UTC m=+818.675580050" watchObservedRunningTime="2025-11-25 09:18:05.478206377 +0000 UTC m=+818.680701515" Nov 25 09:18:06 crc kubenswrapper[4565]: I1125 09:18:06.461055 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qhxwx" event={"ID":"702c3b01-501a-42d1-a945-603af0fbd306","Type":"ContainerStarted","Data":"3477fdce1915d4947105b21b393a6dbea2ec49846b084873e4269b23e5ebba52"} Nov 25 09:18:06 crc kubenswrapper[4565]: I1125 09:18:06.461467 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qhxwx" event={"ID":"702c3b01-501a-42d1-a945-603af0fbd306","Type":"ContainerStarted","Data":"823a1180c63dd0d1b396b5db23480d2d5dc56dc8abcc76fa7ced702bb601d227"} Nov 25 09:18:06 crc kubenswrapper[4565]: I1125 09:18:06.483805 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-qhxwx" podStartSLOduration=14.046209983 podStartE2EDuration="21.483788834s" 
podCreationTimestamp="2025-11-25 09:17:45 +0000 UTC" firstStartedPulling="2025-11-25 09:17:56.819044534 +0000 UTC m=+810.021539671" lastFinishedPulling="2025-11-25 09:18:04.256623384 +0000 UTC m=+817.459118522" observedRunningTime="2025-11-25 09:18:06.480373545 +0000 UTC m=+819.682868682" watchObservedRunningTime="2025-11-25 09:18:06.483788834 +0000 UTC m=+819.686283972" Nov 25 09:18:06 crc kubenswrapper[4565]: I1125 09:18:06.590235 4565 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 25 09:18:07 crc kubenswrapper[4565]: I1125 09:18:07.469245 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-qhxwx" Nov 25 09:18:07 crc kubenswrapper[4565]: I1125 09:18:07.469302 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-qhxwx" Nov 25 09:18:07 crc kubenswrapper[4565]: I1125 09:18:07.501674 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Nov 25 09:18:07 crc kubenswrapper[4565]: I1125 09:18:07.501718 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Nov 25 09:18:08 crc kubenswrapper[4565]: I1125 09:18:08.262163 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Nov 25 09:18:08 crc kubenswrapper[4565]: I1125 09:18:08.262203 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Nov 25 09:18:08 crc kubenswrapper[4565]: I1125 09:18:08.324376 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Nov 25 09:18:08 crc kubenswrapper[4565]: I1125 09:18:08.539579 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Nov 25 09:18:08 crc 
kubenswrapper[4565]: I1125 09:18:08.952922 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Nov 25 09:18:09 crc kubenswrapper[4565]: I1125 09:18:09.101648 4565 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 09:18:09 crc kubenswrapper[4565]: I1125 09:18:09.486977 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"164519a3-6eaf-49ac-bc20-cd1a4b04d594","Type":"ContainerStarted","Data":"6bed0c3c528b6765cc8478fdbec2442c2223dcffa26180f74cec68c31374e52e"} Nov 25 09:18:09 crc kubenswrapper[4565]: I1125 09:18:09.489883 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"68861e53-c198-4971-baf5-dd1653ef84ad","Type":"ContainerStarted","Data":"57f192b2c477345a83110bf2985b2c80831e68de35f355d41a22fee47cfa83fb"} Nov 25 09:18:09 crc kubenswrapper[4565]: I1125 09:18:09.506801 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=12.803466268 podStartE2EDuration="23.506781985s" podCreationTimestamp="2025-11-25 09:17:46 +0000 UTC" firstStartedPulling="2025-11-25 09:17:58.000542876 +0000 UTC m=+811.203038005" lastFinishedPulling="2025-11-25 09:18:08.703858584 +0000 UTC m=+821.906353722" observedRunningTime="2025-11-25 09:18:09.501435255 +0000 UTC m=+822.703930393" watchObservedRunningTime="2025-11-25 09:18:09.506781985 +0000 UTC m=+822.709277123" Nov 25 09:18:09 crc kubenswrapper[4565]: I1125 09:18:09.527307 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=11.731486995000001 podStartE2EDuration="23.527277464s" podCreationTimestamp="2025-11-25 09:17:46 +0000 UTC" firstStartedPulling="2025-11-25 09:17:56.903946536 +0000 UTC m=+810.106441664" lastFinishedPulling="2025-11-25 09:18:08.699736995 +0000 UTC m=+821.902232133" 
observedRunningTime="2025-11-25 09:18:09.524767633 +0000 UTC m=+822.727262771" watchObservedRunningTime="2025-11-25 09:18:09.527277464 +0000 UTC m=+822.729772603" Nov 25 09:18:09 crc kubenswrapper[4565]: I1125 09:18:09.672277 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Nov 25 09:18:09 crc kubenswrapper[4565]: I1125 09:18:09.771981 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Nov 25 09:18:09 crc kubenswrapper[4565]: I1125 09:18:09.855814 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jzwtn" Nov 25 09:18:09 crc kubenswrapper[4565]: I1125 09:18:09.856089 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jzwtn" Nov 25 09:18:09 crc kubenswrapper[4565]: I1125 09:18:09.892238 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jzwtn" Nov 25 09:18:10 crc kubenswrapper[4565]: I1125 09:18:10.413957 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 25 09:18:10 crc kubenswrapper[4565]: I1125 09:18:10.497324 4565 generic.go:334] "Generic (PLEG): container finished" podID="56879c95-6643-4472-993d-41fc2b340dc1" containerID="8d21984948c26ccc22ffa7aadb2dca55c07f5d111f625d12de60a147654a8bda" exitCode=0 Nov 25 09:18:10 crc kubenswrapper[4565]: I1125 09:18:10.497426 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d8746976c-gdbcm" event={"ID":"56879c95-6643-4472-993d-41fc2b340dc1","Type":"ContainerDied","Data":"8d21984948c26ccc22ffa7aadb2dca55c07f5d111f625d12de60a147654a8bda"} Nov 25 09:18:10 crc kubenswrapper[4565]: I1125 09:18:10.534898 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-jzwtn" Nov 25 09:18:10 crc kubenswrapper[4565]: I1125 09:18:10.728609 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jzwtn"] Nov 25 09:18:11 crc kubenswrapper[4565]: I1125 09:18:11.505308 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d8746976c-gdbcm" event={"ID":"56879c95-6643-4472-993d-41fc2b340dc1","Type":"ContainerStarted","Data":"cb3d81a45e621bdc5df9f1ccc97d87f3d819f3364a80c9f3c2fd24994dcfd406"} Nov 25 09:18:11 crc kubenswrapper[4565]: I1125 09:18:11.505870 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d8746976c-gdbcm" Nov 25 09:18:11 crc kubenswrapper[4565]: I1125 09:18:11.507765 4565 generic.go:334] "Generic (PLEG): container finished" podID="f98fcb7a-0761-41fb-b312-3d7188057efc" containerID="485c3dd2729d88cb7c4e3e1b2ca74d288f07cf5f068a524be56c5b4999f5a4e6" exitCode=0 Nov 25 09:18:11 crc kubenswrapper[4565]: I1125 09:18:11.507857 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6486446b9f-4p45g" event={"ID":"f98fcb7a-0761-41fb-b312-3d7188057efc","Type":"ContainerDied","Data":"485c3dd2729d88cb7c4e3e1b2ca74d288f07cf5f068a524be56c5b4999f5a4e6"} Nov 25 09:18:11 crc kubenswrapper[4565]: I1125 09:18:11.525864 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d8746976c-gdbcm" podStartSLOduration=3.4389677450000002 podStartE2EDuration="38.525836501s" podCreationTimestamp="2025-11-25 09:17:33 +0000 UTC" firstStartedPulling="2025-11-25 09:17:34.521163332 +0000 UTC m=+787.723658471" lastFinishedPulling="2025-11-25 09:18:09.608032088 +0000 UTC m=+822.810527227" observedRunningTime="2025-11-25 09:18:11.517918022 +0000 UTC m=+824.720413150" watchObservedRunningTime="2025-11-25 09:18:11.525836501 +0000 UTC m=+824.728331639" Nov 25 09:18:11 crc kubenswrapper[4565]: I1125 09:18:11.718407 4565 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Nov 25 09:18:11 crc kubenswrapper[4565]: I1125 09:18:11.745721 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Nov 25 09:18:11 crc kubenswrapper[4565]: I1125 09:18:11.940287 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Nov 25 09:18:11 crc kubenswrapper[4565]: I1125 09:18:11.967506 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.176672 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-ct2r6"] Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.177848 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-ct2r6" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.181704 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.189681 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ct2r6"] Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.273102 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ct2r6\" (UID: \"1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9\") " pod="openstack/ovn-controller-metrics-ct2r6" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.273232 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9-config\") pod 
\"ovn-controller-metrics-ct2r6\" (UID: \"1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9\") " pod="openstack/ovn-controller-metrics-ct2r6" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.273258 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9-ovs-rundir\") pod \"ovn-controller-metrics-ct2r6\" (UID: \"1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9\") " pod="openstack/ovn-controller-metrics-ct2r6" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.273314 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9-ovn-rundir\") pod \"ovn-controller-metrics-ct2r6\" (UID: \"1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9\") " pod="openstack/ovn-controller-metrics-ct2r6" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.273372 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbvz9\" (UniqueName: \"kubernetes.io/projected/1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9-kube-api-access-jbvz9\") pod \"ovn-controller-metrics-ct2r6\" (UID: \"1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9\") " pod="openstack/ovn-controller-metrics-ct2r6" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.273435 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9-combined-ca-bundle\") pod \"ovn-controller-metrics-ct2r6\" (UID: \"1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9\") " pod="openstack/ovn-controller-metrics-ct2r6" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.345738 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6486446b9f-4p45g"] Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.374605 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbvz9\" (UniqueName: \"kubernetes.io/projected/1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9-kube-api-access-jbvz9\") pod \"ovn-controller-metrics-ct2r6\" (UID: \"1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9\") " pod="openstack/ovn-controller-metrics-ct2r6" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.374685 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9-combined-ca-bundle\") pod \"ovn-controller-metrics-ct2r6\" (UID: \"1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9\") " pod="openstack/ovn-controller-metrics-ct2r6" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.374825 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ct2r6\" (UID: \"1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9\") " pod="openstack/ovn-controller-metrics-ct2r6" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.374900 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9-config\") pod \"ovn-controller-metrics-ct2r6\" (UID: \"1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9\") " pod="openstack/ovn-controller-metrics-ct2r6" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.374919 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9-ovs-rundir\") pod \"ovn-controller-metrics-ct2r6\" (UID: \"1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9\") " pod="openstack/ovn-controller-metrics-ct2r6" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.374973 4565 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9-ovn-rundir\") pod \"ovn-controller-metrics-ct2r6\" (UID: \"1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9\") " pod="openstack/ovn-controller-metrics-ct2r6" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.375542 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9-ovn-rundir\") pod \"ovn-controller-metrics-ct2r6\" (UID: \"1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9\") " pod="openstack/ovn-controller-metrics-ct2r6" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.375599 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9-ovs-rundir\") pod \"ovn-controller-metrics-ct2r6\" (UID: \"1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9\") " pod="openstack/ovn-controller-metrics-ct2r6" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.376071 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9-config\") pod \"ovn-controller-metrics-ct2r6\" (UID: \"1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9\") " pod="openstack/ovn-controller-metrics-ct2r6" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.379703 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9-combined-ca-bundle\") pod \"ovn-controller-metrics-ct2r6\" (UID: \"1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9\") " pod="openstack/ovn-controller-metrics-ct2r6" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.383365 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ct2r6\" (UID: \"1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9\") " pod="openstack/ovn-controller-metrics-ct2r6" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.404507 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65c9b8d4f7-fmlwx"] Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.405872 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.412363 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.430571 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbvz9\" (UniqueName: \"kubernetes.io/projected/1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9-kube-api-access-jbvz9\") pod \"ovn-controller-metrics-ct2r6\" (UID: \"1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9\") " pod="openstack/ovn-controller-metrics-ct2r6" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.445350 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65c9b8d4f7-fmlwx"] Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.492760 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-ct2r6" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.561955 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jzwtn" podUID="a7063b92-d65c-49a6-bf4e-07c4801f8515" containerName="registry-server" containerID="cri-o://61ca34293785bb969c66b72860c09a880678a6ab113217ae00dcfc5f62502746" gracePeriod=2 Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.562270 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6486446b9f-4p45g" podUID="f98fcb7a-0761-41fb-b312-3d7188057efc" containerName="dnsmasq-dns" containerID="cri-o://bb8b27d7984a931ec91f84552a7977e84bb71cb712bd40bc7c52e5bd6d34273d" gracePeriod=10 Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.562332 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6486446b9f-4p45g" event={"ID":"f98fcb7a-0761-41fb-b312-3d7188057efc","Type":"ContainerStarted","Data":"bb8b27d7984a931ec91f84552a7977e84bb71cb712bd40bc7c52e5bd6d34273d"} Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.568188 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.568220 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.568356 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6486446b9f-4p45g" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.582468 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14667942-3d50-46be-9011-b33dafbd106e-dns-svc\") pod \"dnsmasq-dns-65c9b8d4f7-fmlwx\" (UID: \"14667942-3d50-46be-9011-b33dafbd106e\") " 
pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.582606 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kkg5\" (UniqueName: \"kubernetes.io/projected/14667942-3d50-46be-9011-b33dafbd106e-kube-api-access-9kkg5\") pod \"dnsmasq-dns-65c9b8d4f7-fmlwx\" (UID: \"14667942-3d50-46be-9011-b33dafbd106e\") " pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.582719 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14667942-3d50-46be-9011-b33dafbd106e-ovsdbserver-sb\") pod \"dnsmasq-dns-65c9b8d4f7-fmlwx\" (UID: \"14667942-3d50-46be-9011-b33dafbd106e\") " pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.582766 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14667942-3d50-46be-9011-b33dafbd106e-config\") pod \"dnsmasq-dns-65c9b8d4f7-fmlwx\" (UID: \"14667942-3d50-46be-9011-b33dafbd106e\") " pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.617307 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d8746976c-gdbcm"] Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.637820 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6486446b9f-4p45g" podStartSLOduration=-9223371997.216969 podStartE2EDuration="39.637807096s" podCreationTimestamp="2025-11-25 09:17:33 +0000 UTC" firstStartedPulling="2025-11-25 09:17:34.812741323 +0000 UTC m=+788.015236461" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:18:12.635446035 +0000 UTC m=+825.837941173" watchObservedRunningTime="2025-11-25 
09:18:12.637807096 +0000 UTC m=+825.840302234" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.641035 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.661026 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c476d78c5-m6c7g"] Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.662363 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.664972 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.684194 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14667942-3d50-46be-9011-b33dafbd106e-ovsdbserver-sb\") pod \"dnsmasq-dns-65c9b8d4f7-fmlwx\" (UID: \"14667942-3d50-46be-9011-b33dafbd106e\") " pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.684272 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14667942-3d50-46be-9011-b33dafbd106e-config\") pod \"dnsmasq-dns-65c9b8d4f7-fmlwx\" (UID: \"14667942-3d50-46be-9011-b33dafbd106e\") " pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.684329 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14667942-3d50-46be-9011-b33dafbd106e-dns-svc\") pod \"dnsmasq-dns-65c9b8d4f7-fmlwx\" (UID: \"14667942-3d50-46be-9011-b33dafbd106e\") " pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.684484 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-9kkg5\" (UniqueName: \"kubernetes.io/projected/14667942-3d50-46be-9011-b33dafbd106e-kube-api-access-9kkg5\") pod \"dnsmasq-dns-65c9b8d4f7-fmlwx\" (UID: \"14667942-3d50-46be-9011-b33dafbd106e\") " pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.685020 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14667942-3d50-46be-9011-b33dafbd106e-ovsdbserver-sb\") pod \"dnsmasq-dns-65c9b8d4f7-fmlwx\" (UID: \"14667942-3d50-46be-9011-b33dafbd106e\") " pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.685594 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14667942-3d50-46be-9011-b33dafbd106e-config\") pod \"dnsmasq-dns-65c9b8d4f7-fmlwx\" (UID: \"14667942-3d50-46be-9011-b33dafbd106e\") " pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.690038 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14667942-3d50-46be-9011-b33dafbd106e-dns-svc\") pod \"dnsmasq-dns-65c9b8d4f7-fmlwx\" (UID: \"14667942-3d50-46be-9011-b33dafbd106e\") " pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.692237 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c476d78c5-m6c7g"] Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.790303 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61ca2e3-33ba-4887-9753-144f603688b9-config\") pod \"dnsmasq-dns-5c476d78c5-m6c7g\" (UID: \"e61ca2e3-33ba-4887-9753-144f603688b9\") " pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.790353 
4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e61ca2e3-33ba-4887-9753-144f603688b9-ovsdbserver-nb\") pod \"dnsmasq-dns-5c476d78c5-m6c7g\" (UID: \"e61ca2e3-33ba-4887-9753-144f603688b9\") " pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.790395 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e61ca2e3-33ba-4887-9753-144f603688b9-dns-svc\") pod \"dnsmasq-dns-5c476d78c5-m6c7g\" (UID: \"e61ca2e3-33ba-4887-9753-144f603688b9\") " pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.790441 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e61ca2e3-33ba-4887-9753-144f603688b9-ovsdbserver-sb\") pod \"dnsmasq-dns-5c476d78c5-m6c7g\" (UID: \"e61ca2e3-33ba-4887-9753-144f603688b9\") " pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.790489 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpvlj\" (UniqueName: \"kubernetes.io/projected/e61ca2e3-33ba-4887-9753-144f603688b9-kube-api-access-tpvlj\") pod \"dnsmasq-dns-5c476d78c5-m6c7g\" (UID: \"e61ca2e3-33ba-4887-9753-144f603688b9\") " pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.830332 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kkg5\" (UniqueName: \"kubernetes.io/projected/14667942-3d50-46be-9011-b33dafbd106e-kube-api-access-9kkg5\") pod \"dnsmasq-dns-65c9b8d4f7-fmlwx\" (UID: \"14667942-3d50-46be-9011-b33dafbd106e\") " pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" Nov 25 09:18:12 crc 
kubenswrapper[4565]: I1125 09:18:12.830469 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.892578 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61ca2e3-33ba-4887-9753-144f603688b9-config\") pod \"dnsmasq-dns-5c476d78c5-m6c7g\" (UID: \"e61ca2e3-33ba-4887-9753-144f603688b9\") " pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.892635 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e61ca2e3-33ba-4887-9753-144f603688b9-ovsdbserver-nb\") pod \"dnsmasq-dns-5c476d78c5-m6c7g\" (UID: \"e61ca2e3-33ba-4887-9753-144f603688b9\") " pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.892713 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e61ca2e3-33ba-4887-9753-144f603688b9-dns-svc\") pod \"dnsmasq-dns-5c476d78c5-m6c7g\" (UID: \"e61ca2e3-33ba-4887-9753-144f603688b9\") " pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.892800 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e61ca2e3-33ba-4887-9753-144f603688b9-ovsdbserver-sb\") pod \"dnsmasq-dns-5c476d78c5-m6c7g\" (UID: \"e61ca2e3-33ba-4887-9753-144f603688b9\") " pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.893078 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpvlj\" (UniqueName: \"kubernetes.io/projected/e61ca2e3-33ba-4887-9753-144f603688b9-kube-api-access-tpvlj\") pod \"dnsmasq-dns-5c476d78c5-m6c7g\" (UID: 
\"e61ca2e3-33ba-4887-9753-144f603688b9\") " pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.895048 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61ca2e3-33ba-4887-9753-144f603688b9-config\") pod \"dnsmasq-dns-5c476d78c5-m6c7g\" (UID: \"e61ca2e3-33ba-4887-9753-144f603688b9\") " pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.895648 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e61ca2e3-33ba-4887-9753-144f603688b9-ovsdbserver-sb\") pod \"dnsmasq-dns-5c476d78c5-m6c7g\" (UID: \"e61ca2e3-33ba-4887-9753-144f603688b9\") " pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.895648 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e61ca2e3-33ba-4887-9753-144f603688b9-ovsdbserver-nb\") pod \"dnsmasq-dns-5c476d78c5-m6c7g\" (UID: \"e61ca2e3-33ba-4887-9753-144f603688b9\") " pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.896154 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e61ca2e3-33ba-4887-9753-144f603688b9-dns-svc\") pod \"dnsmasq-dns-5c476d78c5-m6c7g\" (UID: \"e61ca2e3-33ba-4887-9753-144f603688b9\") " pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" Nov 25 09:18:12 crc kubenswrapper[4565]: I1125 09:18:12.917110 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpvlj\" (UniqueName: \"kubernetes.io/projected/e61ca2e3-33ba-4887-9753-144f603688b9-kube-api-access-tpvlj\") pod \"dnsmasq-dns-5c476d78c5-m6c7g\" (UID: \"e61ca2e3-33ba-4887-9753-144f603688b9\") " pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" Nov 25 09:18:12 crc 
kubenswrapper[4565]: I1125 09:18:12.977416 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.060521 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.061827 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.065425 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.066818 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.067077 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.067129 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-zbng8" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.067077 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.108100 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.206627 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abfe2157-f884-4325-8d80-7fa9b90754a9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"abfe2157-f884-4325-8d80-7fa9b90754a9\") " pod="openstack/ovn-northd-0" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.212438 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/abfe2157-f884-4325-8d80-7fa9b90754a9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"abfe2157-f884-4325-8d80-7fa9b90754a9\") " pod="openstack/ovn-northd-0" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.212898 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abfe2157-f884-4325-8d80-7fa9b90754a9-config\") pod \"ovn-northd-0\" (UID: \"abfe2157-f884-4325-8d80-7fa9b90754a9\") " pod="openstack/ovn-northd-0" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.213162 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abfe2157-f884-4325-8d80-7fa9b90754a9-scripts\") pod \"ovn-northd-0\" (UID: \"abfe2157-f884-4325-8d80-7fa9b90754a9\") " pod="openstack/ovn-northd-0" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.213249 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/abfe2157-f884-4325-8d80-7fa9b90754a9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"abfe2157-f884-4325-8d80-7fa9b90754a9\") " pod="openstack/ovn-northd-0" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.213295 4565 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7l85\" (UniqueName: \"kubernetes.io/projected/abfe2157-f884-4325-8d80-7fa9b90754a9-kube-api-access-v7l85\") pod \"ovn-northd-0\" (UID: \"abfe2157-f884-4325-8d80-7fa9b90754a9\") " pod="openstack/ovn-northd-0" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.213314 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/abfe2157-f884-4325-8d80-7fa9b90754a9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"abfe2157-f884-4325-8d80-7fa9b90754a9\") " pod="openstack/ovn-northd-0" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.259922 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jzwtn" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.301071 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ct2r6"] Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.323184 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz26x\" (UniqueName: \"kubernetes.io/projected/a7063b92-d65c-49a6-bf4e-07c4801f8515-kube-api-access-qz26x\") pod \"a7063b92-d65c-49a6-bf4e-07c4801f8515\" (UID: \"a7063b92-d65c-49a6-bf4e-07c4801f8515\") " Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.323292 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7063b92-d65c-49a6-bf4e-07c4801f8515-catalog-content\") pod \"a7063b92-d65c-49a6-bf4e-07c4801f8515\" (UID: \"a7063b92-d65c-49a6-bf4e-07c4801f8515\") " Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.323516 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a7063b92-d65c-49a6-bf4e-07c4801f8515-utilities\") pod \"a7063b92-d65c-49a6-bf4e-07c4801f8515\" (UID: \"a7063b92-d65c-49a6-bf4e-07c4801f8515\") " Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.323830 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abfe2157-f884-4325-8d80-7fa9b90754a9-config\") pod \"ovn-northd-0\" (UID: \"abfe2157-f884-4325-8d80-7fa9b90754a9\") " pod="openstack/ovn-northd-0" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.323896 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abfe2157-f884-4325-8d80-7fa9b90754a9-scripts\") pod \"ovn-northd-0\" (UID: \"abfe2157-f884-4325-8d80-7fa9b90754a9\") " pod="openstack/ovn-northd-0" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.323942 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/abfe2157-f884-4325-8d80-7fa9b90754a9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"abfe2157-f884-4325-8d80-7fa9b90754a9\") " pod="openstack/ovn-northd-0" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.323966 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7l85\" (UniqueName: \"kubernetes.io/projected/abfe2157-f884-4325-8d80-7fa9b90754a9-kube-api-access-v7l85\") pod \"ovn-northd-0\" (UID: \"abfe2157-f884-4325-8d80-7fa9b90754a9\") " pod="openstack/ovn-northd-0" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.323985 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/abfe2157-f884-4325-8d80-7fa9b90754a9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"abfe2157-f884-4325-8d80-7fa9b90754a9\") " pod="openstack/ovn-northd-0" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 
09:18:13.324021 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abfe2157-f884-4325-8d80-7fa9b90754a9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"abfe2157-f884-4325-8d80-7fa9b90754a9\") " pod="openstack/ovn-northd-0" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.324057 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/abfe2157-f884-4325-8d80-7fa9b90754a9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"abfe2157-f884-4325-8d80-7fa9b90754a9\") " pod="openstack/ovn-northd-0" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.326105 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abfe2157-f884-4325-8d80-7fa9b90754a9-scripts\") pod \"ovn-northd-0\" (UID: \"abfe2157-f884-4325-8d80-7fa9b90754a9\") " pod="openstack/ovn-northd-0" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.326879 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7063b92-d65c-49a6-bf4e-07c4801f8515-utilities" (OuterVolumeSpecName: "utilities") pod "a7063b92-d65c-49a6-bf4e-07c4801f8515" (UID: "a7063b92-d65c-49a6-bf4e-07c4801f8515"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.329378 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/abfe2157-f884-4325-8d80-7fa9b90754a9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"abfe2157-f884-4325-8d80-7fa9b90754a9\") " pod="openstack/ovn-northd-0" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.332717 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/abfe2157-f884-4325-8d80-7fa9b90754a9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"abfe2157-f884-4325-8d80-7fa9b90754a9\") " pod="openstack/ovn-northd-0" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.335641 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/abfe2157-f884-4325-8d80-7fa9b90754a9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"abfe2157-f884-4325-8d80-7fa9b90754a9\") " pod="openstack/ovn-northd-0" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.335706 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abfe2157-f884-4325-8d80-7fa9b90754a9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"abfe2157-f884-4325-8d80-7fa9b90754a9\") " pod="openstack/ovn-northd-0" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.336646 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abfe2157-f884-4325-8d80-7fa9b90754a9-config\") pod \"ovn-northd-0\" (UID: \"abfe2157-f884-4325-8d80-7fa9b90754a9\") " pod="openstack/ovn-northd-0" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.341658 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a7063b92-d65c-49a6-bf4e-07c4801f8515-kube-api-access-qz26x" (OuterVolumeSpecName: "kube-api-access-qz26x") pod "a7063b92-d65c-49a6-bf4e-07c4801f8515" (UID: "a7063b92-d65c-49a6-bf4e-07c4801f8515"). InnerVolumeSpecName "kube-api-access-qz26x". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.344060 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7l85\" (UniqueName: \"kubernetes.io/projected/abfe2157-f884-4325-8d80-7fa9b90754a9-kube-api-access-v7l85\") pod \"ovn-northd-0\" (UID: \"abfe2157-f884-4325-8d80-7fa9b90754a9\") " pod="openstack/ovn-northd-0" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.357607 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c476d78c5-m6c7g"] Nov 25 09:18:13 crc kubenswrapper[4565]: W1125 09:18:13.371691 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode61ca2e3_33ba_4887_9753_144f603688b9.slice/crio-b7ab00e93d21e428d806dd7f3862f62794b6df8e61ee08a09a751dd7150de0bd WatchSource:0}: Error finding container b7ab00e93d21e428d806dd7f3862f62794b6df8e61ee08a09a751dd7150de0bd: Status 404 returned error can't find the container with id b7ab00e93d21e428d806dd7f3862f62794b6df8e61ee08a09a751dd7150de0bd Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.404193 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.413573 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7063b92-d65c-49a6-bf4e-07c4801f8515-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7063b92-d65c-49a6-bf4e-07c4801f8515" (UID: "a7063b92-d65c-49a6-bf4e-07c4801f8515"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.434729 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qz26x\" (UniqueName: \"kubernetes.io/projected/a7063b92-d65c-49a6-bf4e-07c4801f8515-kube-api-access-qz26x\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.437149 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7063b92-d65c-49a6-bf4e-07c4801f8515-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.437165 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7063b92-d65c-49a6-bf4e-07c4801f8515-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.569325 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" event={"ID":"e61ca2e3-33ba-4887-9753-144f603688b9","Type":"ContainerStarted","Data":"b7ab00e93d21e428d806dd7f3862f62794b6df8e61ee08a09a751dd7150de0bd"} Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.572209 4565 generic.go:334] "Generic (PLEG): container finished" podID="f98fcb7a-0761-41fb-b312-3d7188057efc" containerID="bb8b27d7984a931ec91f84552a7977e84bb71cb712bd40bc7c52e5bd6d34273d" exitCode=0 Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.572264 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6486446b9f-4p45g" event={"ID":"f98fcb7a-0761-41fb-b312-3d7188057efc","Type":"ContainerDied","Data":"bb8b27d7984a931ec91f84552a7977e84bb71cb712bd40bc7c52e5bd6d34273d"} Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.575541 4565 generic.go:334] "Generic (PLEG): container finished" podID="a7063b92-d65c-49a6-bf4e-07c4801f8515" containerID="61ca34293785bb969c66b72860c09a880678a6ab113217ae00dcfc5f62502746" 
exitCode=0 Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.575643 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jzwtn" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.575722 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzwtn" event={"ID":"a7063b92-d65c-49a6-bf4e-07c4801f8515","Type":"ContainerDied","Data":"61ca34293785bb969c66b72860c09a880678a6ab113217ae00dcfc5f62502746"} Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.575756 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzwtn" event={"ID":"a7063b92-d65c-49a6-bf4e-07c4801f8515","Type":"ContainerDied","Data":"11c87bf7eccaa9f59b37d9c2476608f92a7d6119d1c2341e06989933be89d611"} Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.575772 4565 scope.go:117] "RemoveContainer" containerID="61ca34293785bb969c66b72860c09a880678a6ab113217ae00dcfc5f62502746" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.583626 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ct2r6" event={"ID":"1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9","Type":"ContainerStarted","Data":"007c043b01a9625b79cc1ceed62dd015cb7a053b3d492d4b3cebc44ac0df4375"} Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.584313 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d8746976c-gdbcm" podUID="56879c95-6643-4472-993d-41fc2b340dc1" containerName="dnsmasq-dns" containerID="cri-o://cb3d81a45e621bdc5df9f1ccc97d87f3d819f3364a80c9f3c2fd24994dcfd406" gracePeriod=10 Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.588055 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6486446b9f-4p45g" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.612410 4565 scope.go:117] "RemoveContainer" containerID="d29edf7d165e35cfcbf2df46ec4274f48c46c1a750a91dc8be58f0d98faad7cf" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.630516 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jzwtn"] Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.636218 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jzwtn"] Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.641816 4565 scope.go:117] "RemoveContainer" containerID="5d4b03297db44571b447843ab1d227d854256912afec01fb1b2bcf873a733ca5" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.684177 4565 scope.go:117] "RemoveContainer" containerID="61ca34293785bb969c66b72860c09a880678a6ab113217ae00dcfc5f62502746" Nov 25 09:18:13 crc kubenswrapper[4565]: E1125 09:18:13.684558 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61ca34293785bb969c66b72860c09a880678a6ab113217ae00dcfc5f62502746\": container with ID starting with 61ca34293785bb969c66b72860c09a880678a6ab113217ae00dcfc5f62502746 not found: ID does not exist" containerID="61ca34293785bb969c66b72860c09a880678a6ab113217ae00dcfc5f62502746" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.684589 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61ca34293785bb969c66b72860c09a880678a6ab113217ae00dcfc5f62502746"} err="failed to get container status \"61ca34293785bb969c66b72860c09a880678a6ab113217ae00dcfc5f62502746\": rpc error: code = NotFound desc = could not find container \"61ca34293785bb969c66b72860c09a880678a6ab113217ae00dcfc5f62502746\": container with ID starting with 61ca34293785bb969c66b72860c09a880678a6ab113217ae00dcfc5f62502746 not found: ID does 
not exist" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.684611 4565 scope.go:117] "RemoveContainer" containerID="d29edf7d165e35cfcbf2df46ec4274f48c46c1a750a91dc8be58f0d98faad7cf" Nov 25 09:18:13 crc kubenswrapper[4565]: E1125 09:18:13.685176 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d29edf7d165e35cfcbf2df46ec4274f48c46c1a750a91dc8be58f0d98faad7cf\": container with ID starting with d29edf7d165e35cfcbf2df46ec4274f48c46c1a750a91dc8be58f0d98faad7cf not found: ID does not exist" containerID="d29edf7d165e35cfcbf2df46ec4274f48c46c1a750a91dc8be58f0d98faad7cf" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.685203 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d29edf7d165e35cfcbf2df46ec4274f48c46c1a750a91dc8be58f0d98faad7cf"} err="failed to get container status \"d29edf7d165e35cfcbf2df46ec4274f48c46c1a750a91dc8be58f0d98faad7cf\": rpc error: code = NotFound desc = could not find container \"d29edf7d165e35cfcbf2df46ec4274f48c46c1a750a91dc8be58f0d98faad7cf\": container with ID starting with d29edf7d165e35cfcbf2df46ec4274f48c46c1a750a91dc8be58f0d98faad7cf not found: ID does not exist" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.685219 4565 scope.go:117] "RemoveContainer" containerID="5d4b03297db44571b447843ab1d227d854256912afec01fb1b2bcf873a733ca5" Nov 25 09:18:13 crc kubenswrapper[4565]: E1125 09:18:13.685723 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d4b03297db44571b447843ab1d227d854256912afec01fb1b2bcf873a733ca5\": container with ID starting with 5d4b03297db44571b447843ab1d227d854256912afec01fb1b2bcf873a733ca5 not found: ID does not exist" containerID="5d4b03297db44571b447843ab1d227d854256912afec01fb1b2bcf873a733ca5" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.685790 4565 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d4b03297db44571b447843ab1d227d854256912afec01fb1b2bcf873a733ca5"} err="failed to get container status \"5d4b03297db44571b447843ab1d227d854256912afec01fb1b2bcf873a733ca5\": rpc error: code = NotFound desc = could not find container \"5d4b03297db44571b447843ab1d227d854256912afec01fb1b2bcf873a733ca5\": container with ID starting with 5d4b03297db44571b447843ab1d227d854256912afec01fb1b2bcf873a733ca5 not found: ID does not exist" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.719704 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65c9b8d4f7-fmlwx"] Nov 25 09:18:13 crc kubenswrapper[4565]: W1125 09:18:13.725961 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14667942_3d50_46be_9011_b33dafbd106e.slice/crio-82921b85fce122742864b118792626f0ceccee1e04b5199c0add1209c216c362 WatchSource:0}: Error finding container 82921b85fce122742864b118792626f0ceccee1e04b5199c0add1209c216c362: Status 404 returned error can't find the container with id 82921b85fce122742864b118792626f0ceccee1e04b5199c0add1209c216c362 Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.741999 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gmpg\" (UniqueName: \"kubernetes.io/projected/f98fcb7a-0761-41fb-b312-3d7188057efc-kube-api-access-4gmpg\") pod \"f98fcb7a-0761-41fb-b312-3d7188057efc\" (UID: \"f98fcb7a-0761-41fb-b312-3d7188057efc\") " Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.742289 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f98fcb7a-0761-41fb-b312-3d7188057efc-config\") pod \"f98fcb7a-0761-41fb-b312-3d7188057efc\" (UID: \"f98fcb7a-0761-41fb-b312-3d7188057efc\") " Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.742383 4565 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f98fcb7a-0761-41fb-b312-3d7188057efc-dns-svc\") pod \"f98fcb7a-0761-41fb-b312-3d7188057efc\" (UID: \"f98fcb7a-0761-41fb-b312-3d7188057efc\") " Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.749122 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f98fcb7a-0761-41fb-b312-3d7188057efc-kube-api-access-4gmpg" (OuterVolumeSpecName: "kube-api-access-4gmpg") pod "f98fcb7a-0761-41fb-b312-3d7188057efc" (UID: "f98fcb7a-0761-41fb-b312-3d7188057efc"). InnerVolumeSpecName "kube-api-access-4gmpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.790817 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f98fcb7a-0761-41fb-b312-3d7188057efc-config" (OuterVolumeSpecName: "config") pod "f98fcb7a-0761-41fb-b312-3d7188057efc" (UID: "f98fcb7a-0761-41fb-b312-3d7188057efc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.803811 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f98fcb7a-0761-41fb-b312-3d7188057efc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f98fcb7a-0761-41fb-b312-3d7188057efc" (UID: "f98fcb7a-0761-41fb-b312-3d7188057efc"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.845009 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f98fcb7a-0761-41fb-b312-3d7188057efc-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.845031 4565 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f98fcb7a-0761-41fb-b312-3d7188057efc-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.845041 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gmpg\" (UniqueName: \"kubernetes.io/projected/f98fcb7a-0761-41fb-b312-3d7188057efc-kube-api-access-4gmpg\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.895764 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 25 09:18:13 crc kubenswrapper[4565]: I1125 09:18:13.979502 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d8746976c-gdbcm" Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.048796 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngmnv\" (UniqueName: \"kubernetes.io/projected/56879c95-6643-4472-993d-41fc2b340dc1-kube-api-access-ngmnv\") pod \"56879c95-6643-4472-993d-41fc2b340dc1\" (UID: \"56879c95-6643-4472-993d-41fc2b340dc1\") " Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.048983 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56879c95-6643-4472-993d-41fc2b340dc1-dns-svc\") pod \"56879c95-6643-4472-993d-41fc2b340dc1\" (UID: \"56879c95-6643-4472-993d-41fc2b340dc1\") " Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.049060 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56879c95-6643-4472-993d-41fc2b340dc1-config\") pod \"56879c95-6643-4472-993d-41fc2b340dc1\" (UID: \"56879c95-6643-4472-993d-41fc2b340dc1\") " Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.054909 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56879c95-6643-4472-993d-41fc2b340dc1-kube-api-access-ngmnv" (OuterVolumeSpecName: "kube-api-access-ngmnv") pod "56879c95-6643-4472-993d-41fc2b340dc1" (UID: "56879c95-6643-4472-993d-41fc2b340dc1"). InnerVolumeSpecName "kube-api-access-ngmnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.079465 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56879c95-6643-4472-993d-41fc2b340dc1-config" (OuterVolumeSpecName: "config") pod "56879c95-6643-4472-993d-41fc2b340dc1" (UID: "56879c95-6643-4472-993d-41fc2b340dc1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.082047 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56879c95-6643-4472-993d-41fc2b340dc1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "56879c95-6643-4472-993d-41fc2b340dc1" (UID: "56879c95-6643-4472-993d-41fc2b340dc1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.151034 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngmnv\" (UniqueName: \"kubernetes.io/projected/56879c95-6643-4472-993d-41fc2b340dc1-kube-api-access-ngmnv\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.151062 4565 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56879c95-6643-4472-993d-41fc2b340dc1-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.151072 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56879c95-6643-4472-993d-41fc2b340dc1-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.594404 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ct2r6" event={"ID":"1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9","Type":"ContainerStarted","Data":"111f69b03473fc357dc79de0813e9ac57484053f10d03d3a53ea80627c648586"} Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.596470 4565 generic.go:334] "Generic (PLEG): container finished" podID="e61ca2e3-33ba-4887-9753-144f603688b9" containerID="68384ca66545b8d936cecc7e68c189ced4aff0bcd961517f1e32d583455b4fe9" exitCode=0 Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.596544 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" 
event={"ID":"e61ca2e3-33ba-4887-9753-144f603688b9","Type":"ContainerDied","Data":"68384ca66545b8d936cecc7e68c189ced4aff0bcd961517f1e32d583455b4fe9"} Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.598390 4565 generic.go:334] "Generic (PLEG): container finished" podID="56879c95-6643-4472-993d-41fc2b340dc1" containerID="cb3d81a45e621bdc5df9f1ccc97d87f3d819f3364a80c9f3c2fd24994dcfd406" exitCode=0 Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.598430 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d8746976c-gdbcm" event={"ID":"56879c95-6643-4472-993d-41fc2b340dc1","Type":"ContainerDied","Data":"cb3d81a45e621bdc5df9f1ccc97d87f3d819f3364a80c9f3c2fd24994dcfd406"} Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.598442 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d8746976c-gdbcm" Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.598484 4565 scope.go:117] "RemoveContainer" containerID="cb3d81a45e621bdc5df9f1ccc97d87f3d819f3364a80c9f3c2fd24994dcfd406" Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.598467 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d8746976c-gdbcm" event={"ID":"56879c95-6643-4472-993d-41fc2b340dc1","Type":"ContainerDied","Data":"7bb8a71475725bffcb78f200a8502a663b07b4c5b3113cbf47876d3b31c6a38b"} Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.599775 4565 generic.go:334] "Generic (PLEG): container finished" podID="14667942-3d50-46be-9011-b33dafbd106e" containerID="58d5819fc63534500aac82f9324594955048aacec1b686ccbe71866102a38bd4" exitCode=0 Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.599884 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" event={"ID":"14667942-3d50-46be-9011-b33dafbd106e","Type":"ContainerDied","Data":"58d5819fc63534500aac82f9324594955048aacec1b686ccbe71866102a38bd4"} Nov 25 09:18:14 crc 
kubenswrapper[4565]: I1125 09:18:14.599962 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" event={"ID":"14667942-3d50-46be-9011-b33dafbd106e","Type":"ContainerStarted","Data":"82921b85fce122742864b118792626f0ceccee1e04b5199c0add1209c216c362"} Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.602542 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6486446b9f-4p45g" event={"ID":"f98fcb7a-0761-41fb-b312-3d7188057efc","Type":"ContainerDied","Data":"d3d19c31de05c7d559bbfa2fe32d91f9ed29a9c5b9c7de0feb1fdf780b19145d"} Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.602619 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6486446b9f-4p45g" Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.607464 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"abfe2157-f884-4325-8d80-7fa9b90754a9","Type":"ContainerStarted","Data":"b1b1ca9563bedbeac9094ea785c279f40e30933e55ae9ef929a7ca384e5a9b94"} Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.619239 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-ct2r6" podStartSLOduration=2.619214657 podStartE2EDuration="2.619214657s" podCreationTimestamp="2025-11-25 09:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:18:14.61835144 +0000 UTC m=+827.820846577" watchObservedRunningTime="2025-11-25 09:18:14.619214657 +0000 UTC m=+827.821709795" Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.651689 4565 scope.go:117] "RemoveContainer" containerID="8d21984948c26ccc22ffa7aadb2dca55c07f5d111f625d12de60a147654a8bda" Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.746801 4565 scope.go:117] "RemoveContainer" 
containerID="cb3d81a45e621bdc5df9f1ccc97d87f3d819f3364a80c9f3c2fd24994dcfd406" Nov 25 09:18:14 crc kubenswrapper[4565]: E1125 09:18:14.747338 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb3d81a45e621bdc5df9f1ccc97d87f3d819f3364a80c9f3c2fd24994dcfd406\": container with ID starting with cb3d81a45e621bdc5df9f1ccc97d87f3d819f3364a80c9f3c2fd24994dcfd406 not found: ID does not exist" containerID="cb3d81a45e621bdc5df9f1ccc97d87f3d819f3364a80c9f3c2fd24994dcfd406" Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.747389 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb3d81a45e621bdc5df9f1ccc97d87f3d819f3364a80c9f3c2fd24994dcfd406"} err="failed to get container status \"cb3d81a45e621bdc5df9f1ccc97d87f3d819f3364a80c9f3c2fd24994dcfd406\": rpc error: code = NotFound desc = could not find container \"cb3d81a45e621bdc5df9f1ccc97d87f3d819f3364a80c9f3c2fd24994dcfd406\": container with ID starting with cb3d81a45e621bdc5df9f1ccc97d87f3d819f3364a80c9f3c2fd24994dcfd406 not found: ID does not exist" Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.747433 4565 scope.go:117] "RemoveContainer" containerID="8d21984948c26ccc22ffa7aadb2dca55c07f5d111f625d12de60a147654a8bda" Nov 25 09:18:14 crc kubenswrapper[4565]: E1125 09:18:14.747690 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d21984948c26ccc22ffa7aadb2dca55c07f5d111f625d12de60a147654a8bda\": container with ID starting with 8d21984948c26ccc22ffa7aadb2dca55c07f5d111f625d12de60a147654a8bda not found: ID does not exist" containerID="8d21984948c26ccc22ffa7aadb2dca55c07f5d111f625d12de60a147654a8bda" Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.747715 4565 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8d21984948c26ccc22ffa7aadb2dca55c07f5d111f625d12de60a147654a8bda"} err="failed to get container status \"8d21984948c26ccc22ffa7aadb2dca55c07f5d111f625d12de60a147654a8bda\": rpc error: code = NotFound desc = could not find container \"8d21984948c26ccc22ffa7aadb2dca55c07f5d111f625d12de60a147654a8bda\": container with ID starting with 8d21984948c26ccc22ffa7aadb2dca55c07f5d111f625d12de60a147654a8bda not found: ID does not exist" Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.747729 4565 scope.go:117] "RemoveContainer" containerID="bb8b27d7984a931ec91f84552a7977e84bb71cb712bd40bc7c52e5bd6d34273d" Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.773081 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6486446b9f-4p45g"] Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.782160 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6486446b9f-4p45g"] Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.785113 4565 scope.go:117] "RemoveContainer" containerID="485c3dd2729d88cb7c4e3e1b2ca74d288f07cf5f068a524be56c5b4999f5a4e6" Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.794144 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d8746976c-gdbcm"] Nov 25 09:18:14 crc kubenswrapper[4565]: I1125 09:18:14.798190 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d8746976c-gdbcm"] Nov 25 09:18:15 crc kubenswrapper[4565]: I1125 09:18:15.109874 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56879c95-6643-4472-993d-41fc2b340dc1" path="/var/lib/kubelet/pods/56879c95-6643-4472-993d-41fc2b340dc1/volumes" Nov 25 09:18:15 crc kubenswrapper[4565]: I1125 09:18:15.110566 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7063b92-d65c-49a6-bf4e-07c4801f8515" path="/var/lib/kubelet/pods/a7063b92-d65c-49a6-bf4e-07c4801f8515/volumes" Nov 25 09:18:15 crc 
kubenswrapper[4565]: I1125 09:18:15.111486 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f98fcb7a-0761-41fb-b312-3d7188057efc" path="/var/lib/kubelet/pods/f98fcb7a-0761-41fb-b312-3d7188057efc/volumes" Nov 25 09:18:15 crc kubenswrapper[4565]: I1125 09:18:15.354526 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6qg8j" Nov 25 09:18:15 crc kubenswrapper[4565]: I1125 09:18:15.354872 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6qg8j" Nov 25 09:18:15 crc kubenswrapper[4565]: I1125 09:18:15.394985 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6qg8j" Nov 25 09:18:15 crc kubenswrapper[4565]: I1125 09:18:15.619470 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" event={"ID":"e61ca2e3-33ba-4887-9753-144f603688b9","Type":"ContainerStarted","Data":"87385280d79502c4ad17cf067ab21fe6201c1d2e7d6d26dbe29da377a8749cf6"} Nov 25 09:18:15 crc kubenswrapper[4565]: I1125 09:18:15.619604 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" Nov 25 09:18:15 crc kubenswrapper[4565]: I1125 09:18:15.626386 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" event={"ID":"14667942-3d50-46be-9011-b33dafbd106e","Type":"ContainerStarted","Data":"e42239f08fa606f3f22acd263dcf04ab1f10b79dbb153fd12b1fc265975f4640"} Nov 25 09:18:15 crc kubenswrapper[4565]: I1125 09:18:15.626558 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" Nov 25 09:18:15 crc kubenswrapper[4565]: I1125 09:18:15.636495 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"abfe2157-f884-4325-8d80-7fa9b90754a9","Type":"ContainerStarted","Data":"e6c5b46c74a2ad6023a13d2cacfeda8c7f40ca0e433d4d28ec3cf4beefaa2bcd"} Nov 25 09:18:15 crc kubenswrapper[4565]: I1125 09:18:15.636533 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"abfe2157-f884-4325-8d80-7fa9b90754a9","Type":"ContainerStarted","Data":"9fdebb9b51934d1be92137330dced64202baf5926e39586251db5071922522ee"} Nov 25 09:18:15 crc kubenswrapper[4565]: I1125 09:18:15.637374 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Nov 25 09:18:15 crc kubenswrapper[4565]: I1125 09:18:15.641786 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" podStartSLOduration=3.641775812 podStartE2EDuration="3.641775812s" podCreationTimestamp="2025-11-25 09:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:18:15.635786501 +0000 UTC m=+828.838281639" watchObservedRunningTime="2025-11-25 09:18:15.641775812 +0000 UTC m=+828.844270950" Nov 25 09:18:15 crc kubenswrapper[4565]: I1125 09:18:15.654172 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" podStartSLOduration=3.654074798 podStartE2EDuration="3.654074798s" podCreationTimestamp="2025-11-25 09:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:18:15.649633656 +0000 UTC m=+828.852128794" watchObservedRunningTime="2025-11-25 09:18:15.654074798 +0000 UTC m=+828.856569936" Nov 25 09:18:15 crc kubenswrapper[4565]: I1125 09:18:15.665701 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.326708196 podStartE2EDuration="2.665684025s" 
podCreationTimestamp="2025-11-25 09:18:13 +0000 UTC" firstStartedPulling="2025-11-25 09:18:13.920043306 +0000 UTC m=+827.122538444" lastFinishedPulling="2025-11-25 09:18:15.259019135 +0000 UTC m=+828.461514273" observedRunningTime="2025-11-25 09:18:15.662804526 +0000 UTC m=+828.865299665" watchObservedRunningTime="2025-11-25 09:18:15.665684025 +0000 UTC m=+828.868179163" Nov 25 09:18:15 crc kubenswrapper[4565]: I1125 09:18:15.682862 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6qg8j" Nov 25 09:18:16 crc kubenswrapper[4565]: I1125 09:18:16.121460 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6qg8j"] Nov 25 09:18:17 crc kubenswrapper[4565]: I1125 09:18:17.650093 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6qg8j" podUID="fed31d31-597e-4c84-b8da-77761f891338" containerName="registry-server" containerID="cri-o://aabba421b0fef2cf8f36f5650994b775062e4a02775156d49f92a417399f415f" gracePeriod=2 Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.037474 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6qg8j" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.148025 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fed31d31-597e-4c84-b8da-77761f891338-utilities\") pod \"fed31d31-597e-4c84-b8da-77761f891338\" (UID: \"fed31d31-597e-4c84-b8da-77761f891338\") " Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.148290 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fed31d31-597e-4c84-b8da-77761f891338-catalog-content\") pod \"fed31d31-597e-4c84-b8da-77761f891338\" (UID: \"fed31d31-597e-4c84-b8da-77761f891338\") " Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.148390 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xcbn\" (UniqueName: \"kubernetes.io/projected/fed31d31-597e-4c84-b8da-77761f891338-kube-api-access-9xcbn\") pod \"fed31d31-597e-4c84-b8da-77761f891338\" (UID: \"fed31d31-597e-4c84-b8da-77761f891338\") " Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.149648 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fed31d31-597e-4c84-b8da-77761f891338-utilities" (OuterVolumeSpecName: "utilities") pod "fed31d31-597e-4c84-b8da-77761f891338" (UID: "fed31d31-597e-4c84-b8da-77761f891338"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.154194 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fed31d31-597e-4c84-b8da-77761f891338-kube-api-access-9xcbn" (OuterVolumeSpecName: "kube-api-access-9xcbn") pod "fed31d31-597e-4c84-b8da-77761f891338" (UID: "fed31d31-597e-4c84-b8da-77761f891338"). InnerVolumeSpecName "kube-api-access-9xcbn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.162002 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fed31d31-597e-4c84-b8da-77761f891338-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fed31d31-597e-4c84-b8da-77761f891338" (UID: "fed31d31-597e-4c84-b8da-77761f891338"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.250409 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fed31d31-597e-4c84-b8da-77761f891338-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.250662 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xcbn\" (UniqueName: \"kubernetes.io/projected/fed31d31-597e-4c84-b8da-77761f891338-kube-api-access-9xcbn\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.250675 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fed31d31-597e-4c84-b8da-77761f891338-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.649410 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-ztdrc"] Nov 25 09:18:18 crc kubenswrapper[4565]: E1125 09:18:18.649963 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56879c95-6643-4472-993d-41fc2b340dc1" containerName="dnsmasq-dns" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.649983 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="56879c95-6643-4472-993d-41fc2b340dc1" containerName="dnsmasq-dns" Nov 25 09:18:18 crc kubenswrapper[4565]: E1125 09:18:18.650002 4565 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fed31d31-597e-4c84-b8da-77761f891338" containerName="extract-content" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.650008 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed31d31-597e-4c84-b8da-77761f891338" containerName="extract-content" Nov 25 09:18:18 crc kubenswrapper[4565]: E1125 09:18:18.650019 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fed31d31-597e-4c84-b8da-77761f891338" containerName="extract-utilities" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.650025 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed31d31-597e-4c84-b8da-77761f891338" containerName="extract-utilities" Nov 25 09:18:18 crc kubenswrapper[4565]: E1125 09:18:18.650033 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56879c95-6643-4472-993d-41fc2b340dc1" containerName="init" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.650038 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="56879c95-6643-4472-993d-41fc2b340dc1" containerName="init" Nov 25 09:18:18 crc kubenswrapper[4565]: E1125 09:18:18.650048 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7063b92-d65c-49a6-bf4e-07c4801f8515" containerName="extract-content" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.650053 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7063b92-d65c-49a6-bf4e-07c4801f8515" containerName="extract-content" Nov 25 09:18:18 crc kubenswrapper[4565]: E1125 09:18:18.650061 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7063b92-d65c-49a6-bf4e-07c4801f8515" containerName="registry-server" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.650066 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7063b92-d65c-49a6-bf4e-07c4801f8515" containerName="registry-server" Nov 25 09:18:18 crc kubenswrapper[4565]: E1125 09:18:18.650075 4565 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f98fcb7a-0761-41fb-b312-3d7188057efc" containerName="init" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.650084 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f98fcb7a-0761-41fb-b312-3d7188057efc" containerName="init" Nov 25 09:18:18 crc kubenswrapper[4565]: E1125 09:18:18.650108 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7063b92-d65c-49a6-bf4e-07c4801f8515" containerName="extract-utilities" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.650113 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7063b92-d65c-49a6-bf4e-07c4801f8515" containerName="extract-utilities" Nov 25 09:18:18 crc kubenswrapper[4565]: E1125 09:18:18.650121 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fed31d31-597e-4c84-b8da-77761f891338" containerName="registry-server" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.650128 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed31d31-597e-4c84-b8da-77761f891338" containerName="registry-server" Nov 25 09:18:18 crc kubenswrapper[4565]: E1125 09:18:18.650139 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f98fcb7a-0761-41fb-b312-3d7188057efc" containerName="dnsmasq-dns" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.650148 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f98fcb7a-0761-41fb-b312-3d7188057efc" containerName="dnsmasq-dns" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.650370 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="f98fcb7a-0761-41fb-b312-3d7188057efc" containerName="dnsmasq-dns" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.650384 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7063b92-d65c-49a6-bf4e-07c4801f8515" containerName="registry-server" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.650394 4565 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fed31d31-597e-4c84-b8da-77761f891338" containerName="registry-server" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.650402 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="56879c95-6643-4472-993d-41fc2b340dc1" containerName="dnsmasq-dns" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.653273 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ztdrc" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.656074 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-ztdrc"] Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.667052 4565 generic.go:334] "Generic (PLEG): container finished" podID="fed31d31-597e-4c84-b8da-77761f891338" containerID="aabba421b0fef2cf8f36f5650994b775062e4a02775156d49f92a417399f415f" exitCode=0 Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.667098 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qg8j" event={"ID":"fed31d31-597e-4c84-b8da-77761f891338","Type":"ContainerDied","Data":"aabba421b0fef2cf8f36f5650994b775062e4a02775156d49f92a417399f415f"} Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.667129 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qg8j" event={"ID":"fed31d31-597e-4c84-b8da-77761f891338","Type":"ContainerDied","Data":"bb4a0202440f277cea09693879aee74a93c816e95cfef22b03275b185fec91c2"} Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.667149 4565 scope.go:117] "RemoveContainer" containerID="aabba421b0fef2cf8f36f5650994b775062e4a02775156d49f92a417399f415f" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.667282 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6qg8j" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.674784 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-4bfd-account-create-cphjj"] Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.676196 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4bfd-account-create-cphjj" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.678048 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.693237 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4bfd-account-create-cphjj"] Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.712218 4565 scope.go:117] "RemoveContainer" containerID="b2870d1d451620b0553ff8954551ddd7c3c7c97d05131c140c66e15dda26201f" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.731896 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6qg8j"] Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.731968 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6qg8j"] Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.748435 4565 scope.go:117] "RemoveContainer" containerID="f0f31dacaca13a52d5751f5c0fd50e1f6777409f0815f45eebcae50d4e355c79" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.759629 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca742c16-4156-489a-8989-a25f91f6ef78-operator-scripts\") pod \"keystone-4bfd-account-create-cphjj\" (UID: \"ca742c16-4156-489a-8989-a25f91f6ef78\") " pod="openstack/keystone-4bfd-account-create-cphjj" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.759727 4565 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b-operator-scripts\") pod \"keystone-db-create-ztdrc\" (UID: \"4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b\") " pod="openstack/keystone-db-create-ztdrc" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.759882 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wktw\" (UniqueName: \"kubernetes.io/projected/4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b-kube-api-access-8wktw\") pod \"keystone-db-create-ztdrc\" (UID: \"4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b\") " pod="openstack/keystone-db-create-ztdrc" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.759956 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn4wx\" (UniqueName: \"kubernetes.io/projected/ca742c16-4156-489a-8989-a25f91f6ef78-kube-api-access-nn4wx\") pod \"keystone-4bfd-account-create-cphjj\" (UID: \"ca742c16-4156-489a-8989-a25f91f6ef78\") " pod="openstack/keystone-4bfd-account-create-cphjj" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.767139 4565 scope.go:117] "RemoveContainer" containerID="aabba421b0fef2cf8f36f5650994b775062e4a02775156d49f92a417399f415f" Nov 25 09:18:18 crc kubenswrapper[4565]: E1125 09:18:18.767899 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aabba421b0fef2cf8f36f5650994b775062e4a02775156d49f92a417399f415f\": container with ID starting with aabba421b0fef2cf8f36f5650994b775062e4a02775156d49f92a417399f415f not found: ID does not exist" containerID="aabba421b0fef2cf8f36f5650994b775062e4a02775156d49f92a417399f415f" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.768004 4565 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"aabba421b0fef2cf8f36f5650994b775062e4a02775156d49f92a417399f415f"} err="failed to get container status \"aabba421b0fef2cf8f36f5650994b775062e4a02775156d49f92a417399f415f\": rpc error: code = NotFound desc = could not find container \"aabba421b0fef2cf8f36f5650994b775062e4a02775156d49f92a417399f415f\": container with ID starting with aabba421b0fef2cf8f36f5650994b775062e4a02775156d49f92a417399f415f not found: ID does not exist" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.768048 4565 scope.go:117] "RemoveContainer" containerID="b2870d1d451620b0553ff8954551ddd7c3c7c97d05131c140c66e15dda26201f" Nov 25 09:18:18 crc kubenswrapper[4565]: E1125 09:18:18.768453 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2870d1d451620b0553ff8954551ddd7c3c7c97d05131c140c66e15dda26201f\": container with ID starting with b2870d1d451620b0553ff8954551ddd7c3c7c97d05131c140c66e15dda26201f not found: ID does not exist" containerID="b2870d1d451620b0553ff8954551ddd7c3c7c97d05131c140c66e15dda26201f" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.768499 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2870d1d451620b0553ff8954551ddd7c3c7c97d05131c140c66e15dda26201f"} err="failed to get container status \"b2870d1d451620b0553ff8954551ddd7c3c7c97d05131c140c66e15dda26201f\": rpc error: code = NotFound desc = could not find container \"b2870d1d451620b0553ff8954551ddd7c3c7c97d05131c140c66e15dda26201f\": container with ID starting with b2870d1d451620b0553ff8954551ddd7c3c7c97d05131c140c66e15dda26201f not found: ID does not exist" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.768533 4565 scope.go:117] "RemoveContainer" containerID="f0f31dacaca13a52d5751f5c0fd50e1f6777409f0815f45eebcae50d4e355c79" Nov 25 09:18:18 crc kubenswrapper[4565]: E1125 09:18:18.768843 4565 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f0f31dacaca13a52d5751f5c0fd50e1f6777409f0815f45eebcae50d4e355c79\": container with ID starting with f0f31dacaca13a52d5751f5c0fd50e1f6777409f0815f45eebcae50d4e355c79 not found: ID does not exist" containerID="f0f31dacaca13a52d5751f5c0fd50e1f6777409f0815f45eebcae50d4e355c79" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.768874 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0f31dacaca13a52d5751f5c0fd50e1f6777409f0815f45eebcae50d4e355c79"} err="failed to get container status \"f0f31dacaca13a52d5751f5c0fd50e1f6777409f0815f45eebcae50d4e355c79\": rpc error: code = NotFound desc = could not find container \"f0f31dacaca13a52d5751f5c0fd50e1f6777409f0815f45eebcae50d4e355c79\": container with ID starting with f0f31dacaca13a52d5751f5c0fd50e1f6777409f0815f45eebcae50d4e355c79 not found: ID does not exist" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.859267 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-p6cvm"] Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.860630 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-p6cvm" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.868857 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-p6cvm"] Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.869263 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wktw\" (UniqueName: \"kubernetes.io/projected/4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b-kube-api-access-8wktw\") pod \"keystone-db-create-ztdrc\" (UID: \"4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b\") " pod="openstack/keystone-db-create-ztdrc" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.869315 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn4wx\" (UniqueName: \"kubernetes.io/projected/ca742c16-4156-489a-8989-a25f91f6ef78-kube-api-access-nn4wx\") pod \"keystone-4bfd-account-create-cphjj\" (UID: \"ca742c16-4156-489a-8989-a25f91f6ef78\") " pod="openstack/keystone-4bfd-account-create-cphjj" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.869446 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca742c16-4156-489a-8989-a25f91f6ef78-operator-scripts\") pod \"keystone-4bfd-account-create-cphjj\" (UID: \"ca742c16-4156-489a-8989-a25f91f6ef78\") " pod="openstack/keystone-4bfd-account-create-cphjj" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.869476 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b-operator-scripts\") pod \"keystone-db-create-ztdrc\" (UID: \"4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b\") " pod="openstack/keystone-db-create-ztdrc" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.870583 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b-operator-scripts\") pod \"keystone-db-create-ztdrc\" (UID: \"4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b\") " pod="openstack/keystone-db-create-ztdrc" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.870891 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca742c16-4156-489a-8989-a25f91f6ef78-operator-scripts\") pod \"keystone-4bfd-account-create-cphjj\" (UID: \"ca742c16-4156-489a-8989-a25f91f6ef78\") " pod="openstack/keystone-4bfd-account-create-cphjj" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.888712 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn4wx\" (UniqueName: \"kubernetes.io/projected/ca742c16-4156-489a-8989-a25f91f6ef78-kube-api-access-nn4wx\") pod \"keystone-4bfd-account-create-cphjj\" (UID: \"ca742c16-4156-489a-8989-a25f91f6ef78\") " pod="openstack/keystone-4bfd-account-create-cphjj" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.889332 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wktw\" (UniqueName: \"kubernetes.io/projected/4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b-kube-api-access-8wktw\") pod \"keystone-db-create-ztdrc\" (UID: \"4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b\") " pod="openstack/keystone-db-create-ztdrc" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.970539 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-fbd2-account-create-5l9xb"] Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.970626 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-ztdrc" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.971699 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d679723-ad82-4ea3-a423-46db64ebc105-operator-scripts\") pod \"placement-db-create-p6cvm\" (UID: \"1d679723-ad82-4ea3-a423-46db64ebc105\") " pod="openstack/placement-db-create-p6cvm" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.972112 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-796td\" (UniqueName: \"kubernetes.io/projected/1d679723-ad82-4ea3-a423-46db64ebc105-kube-api-access-796td\") pod \"placement-db-create-p6cvm\" (UID: \"1d679723-ad82-4ea3-a423-46db64ebc105\") " pod="openstack/placement-db-create-p6cvm" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.972177 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fbd2-account-create-5l9xb" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.980453 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.986257 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fbd2-account-create-5l9xb"] Nov 25 09:18:18 crc kubenswrapper[4565]: I1125 09:18:18.994453 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4bfd-account-create-cphjj" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.085505 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-796td\" (UniqueName: \"kubernetes.io/projected/1d679723-ad82-4ea3-a423-46db64ebc105-kube-api-access-796td\") pod \"placement-db-create-p6cvm\" (UID: \"1d679723-ad82-4ea3-a423-46db64ebc105\") " pod="openstack/placement-db-create-p6cvm" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.085877 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk928\" (UniqueName: \"kubernetes.io/projected/d8124d68-2866-4758-9728-769b860913ee-kube-api-access-xk928\") pod \"placement-fbd2-account-create-5l9xb\" (UID: \"d8124d68-2866-4758-9728-769b860913ee\") " pod="openstack/placement-fbd2-account-create-5l9xb" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.086009 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8124d68-2866-4758-9728-769b860913ee-operator-scripts\") pod \"placement-fbd2-account-create-5l9xb\" (UID: \"d8124d68-2866-4758-9728-769b860913ee\") " pod="openstack/placement-fbd2-account-create-5l9xb" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.086042 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d679723-ad82-4ea3-a423-46db64ebc105-operator-scripts\") pod \"placement-db-create-p6cvm\" (UID: \"1d679723-ad82-4ea3-a423-46db64ebc105\") " pod="openstack/placement-db-create-p6cvm" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.091573 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d679723-ad82-4ea3-a423-46db64ebc105-operator-scripts\") pod 
\"placement-db-create-p6cvm\" (UID: \"1d679723-ad82-4ea3-a423-46db64ebc105\") " pod="openstack/placement-db-create-p6cvm" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.128902 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-796td\" (UniqueName: \"kubernetes.io/projected/1d679723-ad82-4ea3-a423-46db64ebc105-kube-api-access-796td\") pod \"placement-db-create-p6cvm\" (UID: \"1d679723-ad82-4ea3-a423-46db64ebc105\") " pod="openstack/placement-db-create-p6cvm" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.159656 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fed31d31-597e-4c84-b8da-77761f891338" path="/var/lib/kubelet/pods/fed31d31-597e-4c84-b8da-77761f891338/volumes" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.161424 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-5q2pv"] Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.165431 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-5q2pv"] Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.165587 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5q2pv" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.184063 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-p6cvm" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.188060 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk928\" (UniqueName: \"kubernetes.io/projected/d8124d68-2866-4758-9728-769b860913ee-kube-api-access-xk928\") pod \"placement-fbd2-account-create-5l9xb\" (UID: \"d8124d68-2866-4758-9728-769b860913ee\") " pod="openstack/placement-fbd2-account-create-5l9xb" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.188142 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8124d68-2866-4758-9728-769b860913ee-operator-scripts\") pod \"placement-fbd2-account-create-5l9xb\" (UID: \"d8124d68-2866-4758-9728-769b860913ee\") " pod="openstack/placement-fbd2-account-create-5l9xb" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.189773 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8124d68-2866-4758-9728-769b860913ee-operator-scripts\") pod \"placement-fbd2-account-create-5l9xb\" (UID: \"d8124d68-2866-4758-9728-769b860913ee\") " pod="openstack/placement-fbd2-account-create-5l9xb" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.190735 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f100-account-create-sj2lj"] Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.192302 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f100-account-create-sj2lj" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.194262 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.203521 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f100-account-create-sj2lj"] Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.234974 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk928\" (UniqueName: \"kubernetes.io/projected/d8124d68-2866-4758-9728-769b860913ee-kube-api-access-xk928\") pod \"placement-fbd2-account-create-5l9xb\" (UID: \"d8124d68-2866-4758-9728-769b860913ee\") " pod="openstack/placement-fbd2-account-create-5l9xb" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.281459 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4bfd-account-create-cphjj"] Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.290196 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llntn\" (UniqueName: \"kubernetes.io/projected/3c48b8c0-09b0-4c38-94a6-9fc970eab3ca-kube-api-access-llntn\") pod \"glance-f100-account-create-sj2lj\" (UID: \"3c48b8c0-09b0-4c38-94a6-9fc970eab3ca\") " pod="openstack/glance-f100-account-create-sj2lj" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.290312 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwpfv\" (UniqueName: \"kubernetes.io/projected/e7bb9655-fc2f-44fd-8910-541992e2896a-kube-api-access-cwpfv\") pod \"glance-db-create-5q2pv\" (UID: \"e7bb9655-fc2f-44fd-8910-541992e2896a\") " pod="openstack/glance-db-create-5q2pv" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.290760 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c48b8c0-09b0-4c38-94a6-9fc970eab3ca-operator-scripts\") pod \"glance-f100-account-create-sj2lj\" (UID: \"3c48b8c0-09b0-4c38-94a6-9fc970eab3ca\") " pod="openstack/glance-f100-account-create-sj2lj" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.291175 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7bb9655-fc2f-44fd-8910-541992e2896a-operator-scripts\") pod \"glance-db-create-5q2pv\" (UID: \"e7bb9655-fc2f-44fd-8910-541992e2896a\") " pod="openstack/glance-db-create-5q2pv" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.335820 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fbd2-account-create-5l9xb" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.394139 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llntn\" (UniqueName: \"kubernetes.io/projected/3c48b8c0-09b0-4c38-94a6-9fc970eab3ca-kube-api-access-llntn\") pod \"glance-f100-account-create-sj2lj\" (UID: \"3c48b8c0-09b0-4c38-94a6-9fc970eab3ca\") " pod="openstack/glance-f100-account-create-sj2lj" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.394212 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwpfv\" (UniqueName: \"kubernetes.io/projected/e7bb9655-fc2f-44fd-8910-541992e2896a-kube-api-access-cwpfv\") pod \"glance-db-create-5q2pv\" (UID: \"e7bb9655-fc2f-44fd-8910-541992e2896a\") " pod="openstack/glance-db-create-5q2pv" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.394255 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c48b8c0-09b0-4c38-94a6-9fc970eab3ca-operator-scripts\") pod \"glance-f100-account-create-sj2lj\" (UID: \"3c48b8c0-09b0-4c38-94a6-9fc970eab3ca\") 
" pod="openstack/glance-f100-account-create-sj2lj" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.394300 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7bb9655-fc2f-44fd-8910-541992e2896a-operator-scripts\") pod \"glance-db-create-5q2pv\" (UID: \"e7bb9655-fc2f-44fd-8910-541992e2896a\") " pod="openstack/glance-db-create-5q2pv" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.395027 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7bb9655-fc2f-44fd-8910-541992e2896a-operator-scripts\") pod \"glance-db-create-5q2pv\" (UID: \"e7bb9655-fc2f-44fd-8910-541992e2896a\") " pod="openstack/glance-db-create-5q2pv" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.395526 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c48b8c0-09b0-4c38-94a6-9fc970eab3ca-operator-scripts\") pod \"glance-f100-account-create-sj2lj\" (UID: \"3c48b8c0-09b0-4c38-94a6-9fc970eab3ca\") " pod="openstack/glance-f100-account-create-sj2lj" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.412960 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwpfv\" (UniqueName: \"kubernetes.io/projected/e7bb9655-fc2f-44fd-8910-541992e2896a-kube-api-access-cwpfv\") pod \"glance-db-create-5q2pv\" (UID: \"e7bb9655-fc2f-44fd-8910-541992e2896a\") " pod="openstack/glance-db-create-5q2pv" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.413769 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llntn\" (UniqueName: \"kubernetes.io/projected/3c48b8c0-09b0-4c38-94a6-9fc970eab3ca-kube-api-access-llntn\") pod \"glance-f100-account-create-sj2lj\" (UID: \"3c48b8c0-09b0-4c38-94a6-9fc970eab3ca\") " pod="openstack/glance-f100-account-create-sj2lj" Nov 25 09:18:19 
crc kubenswrapper[4565]: I1125 09:18:19.500830 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5q2pv" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.510980 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f100-account-create-sj2lj" Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.519685 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-ztdrc"] Nov 25 09:18:19 crc kubenswrapper[4565]: W1125 09:18:19.528811 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cf7c33b_48ca_41a9_9d40_c0d12e5fa07b.slice/crio-80fd0c56fd5c9642c697e1f39995c125f743adf2801d8dad08745e15046385d7 WatchSource:0}: Error finding container 80fd0c56fd5c9642c697e1f39995c125f743adf2801d8dad08745e15046385d7: Status 404 returned error can't find the container with id 80fd0c56fd5c9642c697e1f39995c125f743adf2801d8dad08745e15046385d7 Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.650724 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-p6cvm"] Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.677326 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ztdrc" event={"ID":"4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b","Type":"ContainerStarted","Data":"80fd0c56fd5c9642c697e1f39995c125f743adf2801d8dad08745e15046385d7"} Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.680406 4565 generic.go:334] "Generic (PLEG): container finished" podID="ca742c16-4156-489a-8989-a25f91f6ef78" containerID="543858316b8b49fb47d3fe9c8bd5ad5195e17d16d0f91fad569631c21fb9476c" exitCode=0 Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.680523 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4bfd-account-create-cphjj" 
event={"ID":"ca742c16-4156-489a-8989-a25f91f6ef78","Type":"ContainerDied","Data":"543858316b8b49fb47d3fe9c8bd5ad5195e17d16d0f91fad569631c21fb9476c"} Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.680539 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4bfd-account-create-cphjj" event={"ID":"ca742c16-4156-489a-8989-a25f91f6ef78","Type":"ContainerStarted","Data":"0c72b9686b8e2062093094aaeae4d3a833a31b37e64823937eac397ec62215e6"} Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.764922 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fbd2-account-create-5l9xb"] Nov 25 09:18:19 crc kubenswrapper[4565]: W1125 09:18:19.979102 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7bb9655_fc2f_44fd_8910_541992e2896a.slice/crio-20e65a892b5bc675e6d3419a230ee3d4eab59a83121eb9c97491b27b96a53789 WatchSource:0}: Error finding container 20e65a892b5bc675e6d3419a230ee3d4eab59a83121eb9c97491b27b96a53789: Status 404 returned error can't find the container with id 20e65a892b5bc675e6d3419a230ee3d4eab59a83121eb9c97491b27b96a53789 Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.982815 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-5q2pv"] Nov 25 09:18:19 crc kubenswrapper[4565]: I1125 09:18:19.991112 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f100-account-create-sj2lj"] Nov 25 09:18:20 crc kubenswrapper[4565]: W1125 09:18:20.001920 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c48b8c0_09b0_4c38_94a6_9fc970eab3ca.slice/crio-6a1efe36956dd077e2f70dfcdb261299e1eaa1e3609a3a426f81436e4e837ddf WatchSource:0}: Error finding container 6a1efe36956dd077e2f70dfcdb261299e1eaa1e3609a3a426f81436e4e837ddf: Status 404 returned error can't find the container with id 
6a1efe36956dd077e2f70dfcdb261299e1eaa1e3609a3a426f81436e4e837ddf Nov 25 09:18:20 crc kubenswrapper[4565]: I1125 09:18:20.689461 4565 generic.go:334] "Generic (PLEG): container finished" podID="e7bb9655-fc2f-44fd-8910-541992e2896a" containerID="a34c4cfe9cd9fe86d1bce765c69b87614e8dce9aff0e02130376a298dea7ab51" exitCode=0 Nov 25 09:18:20 crc kubenswrapper[4565]: I1125 09:18:20.689515 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5q2pv" event={"ID":"e7bb9655-fc2f-44fd-8910-541992e2896a","Type":"ContainerDied","Data":"a34c4cfe9cd9fe86d1bce765c69b87614e8dce9aff0e02130376a298dea7ab51"} Nov 25 09:18:20 crc kubenswrapper[4565]: I1125 09:18:20.689790 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5q2pv" event={"ID":"e7bb9655-fc2f-44fd-8910-541992e2896a","Type":"ContainerStarted","Data":"20e65a892b5bc675e6d3419a230ee3d4eab59a83121eb9c97491b27b96a53789"} Nov 25 09:18:20 crc kubenswrapper[4565]: I1125 09:18:20.691708 4565 generic.go:334] "Generic (PLEG): container finished" podID="1d679723-ad82-4ea3-a423-46db64ebc105" containerID="f4ade216b44359441d097c21ab05b852dc569e854fbb2d38fca4dde013562a68" exitCode=0 Nov 25 09:18:20 crc kubenswrapper[4565]: I1125 09:18:20.691796 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-p6cvm" event={"ID":"1d679723-ad82-4ea3-a423-46db64ebc105","Type":"ContainerDied","Data":"f4ade216b44359441d097c21ab05b852dc569e854fbb2d38fca4dde013562a68"} Nov 25 09:18:20 crc kubenswrapper[4565]: I1125 09:18:20.691867 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-p6cvm" event={"ID":"1d679723-ad82-4ea3-a423-46db64ebc105","Type":"ContainerStarted","Data":"83c8a38aac75ea0c17cbd007b3359173eacb336eafcb50bbc597b5472fdffd0f"} Nov 25 09:18:20 crc kubenswrapper[4565]: I1125 09:18:20.693617 4565 generic.go:334] "Generic (PLEG): container finished" podID="4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b" 
containerID="56cffca7d8be4f95f11f1401ddb5d6785553f52070afa653c8ae083e887ca502" exitCode=0 Nov 25 09:18:20 crc kubenswrapper[4565]: I1125 09:18:20.693645 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ztdrc" event={"ID":"4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b","Type":"ContainerDied","Data":"56cffca7d8be4f95f11f1401ddb5d6785553f52070afa653c8ae083e887ca502"} Nov 25 09:18:20 crc kubenswrapper[4565]: I1125 09:18:20.695686 4565 generic.go:334] "Generic (PLEG): container finished" podID="3c48b8c0-09b0-4c38-94a6-9fc970eab3ca" containerID="2c02fd4fc7b83a3308e6cfff9270bdad297a061340017877b8c47702cb174d34" exitCode=0 Nov 25 09:18:20 crc kubenswrapper[4565]: I1125 09:18:20.695740 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f100-account-create-sj2lj" event={"ID":"3c48b8c0-09b0-4c38-94a6-9fc970eab3ca","Type":"ContainerDied","Data":"2c02fd4fc7b83a3308e6cfff9270bdad297a061340017877b8c47702cb174d34"} Nov 25 09:18:20 crc kubenswrapper[4565]: I1125 09:18:20.695768 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f100-account-create-sj2lj" event={"ID":"3c48b8c0-09b0-4c38-94a6-9fc970eab3ca","Type":"ContainerStarted","Data":"6a1efe36956dd077e2f70dfcdb261299e1eaa1e3609a3a426f81436e4e837ddf"} Nov 25 09:18:20 crc kubenswrapper[4565]: I1125 09:18:20.698460 4565 generic.go:334] "Generic (PLEG): container finished" podID="d8124d68-2866-4758-9728-769b860913ee" containerID="f85e02a97582edb6076c9c3a9446e9b8296e9dc8468cb9e9a32cfaa151dd5441" exitCode=0 Nov 25 09:18:20 crc kubenswrapper[4565]: I1125 09:18:20.698585 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fbd2-account-create-5l9xb" event={"ID":"d8124d68-2866-4758-9728-769b860913ee","Type":"ContainerDied","Data":"f85e02a97582edb6076c9c3a9446e9b8296e9dc8468cb9e9a32cfaa151dd5441"} Nov 25 09:18:20 crc kubenswrapper[4565]: I1125 09:18:20.698618 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-fbd2-account-create-5l9xb" event={"ID":"d8124d68-2866-4758-9728-769b860913ee","Type":"ContainerStarted","Data":"877033227f7270d5855ebc07c60055f6f7efc57a4a49519e5dac7bbb29a167dc"} Nov 25 09:18:21 crc kubenswrapper[4565]: I1125 09:18:21.036731 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4bfd-account-create-cphjj" Nov 25 09:18:21 crc kubenswrapper[4565]: I1125 09:18:21.139559 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn4wx\" (UniqueName: \"kubernetes.io/projected/ca742c16-4156-489a-8989-a25f91f6ef78-kube-api-access-nn4wx\") pod \"ca742c16-4156-489a-8989-a25f91f6ef78\" (UID: \"ca742c16-4156-489a-8989-a25f91f6ef78\") " Nov 25 09:18:21 crc kubenswrapper[4565]: I1125 09:18:21.139626 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca742c16-4156-489a-8989-a25f91f6ef78-operator-scripts\") pod \"ca742c16-4156-489a-8989-a25f91f6ef78\" (UID: \"ca742c16-4156-489a-8989-a25f91f6ef78\") " Nov 25 09:18:21 crc kubenswrapper[4565]: I1125 09:18:21.140332 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca742c16-4156-489a-8989-a25f91f6ef78-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ca742c16-4156-489a-8989-a25f91f6ef78" (UID: "ca742c16-4156-489a-8989-a25f91f6ef78"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:21 crc kubenswrapper[4565]: I1125 09:18:21.147494 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca742c16-4156-489a-8989-a25f91f6ef78-kube-api-access-nn4wx" (OuterVolumeSpecName: "kube-api-access-nn4wx") pod "ca742c16-4156-489a-8989-a25f91f6ef78" (UID: "ca742c16-4156-489a-8989-a25f91f6ef78"). InnerVolumeSpecName "kube-api-access-nn4wx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:18:21 crc kubenswrapper[4565]: I1125 09:18:21.243396 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn4wx\" (UniqueName: \"kubernetes.io/projected/ca742c16-4156-489a-8989-a25f91f6ef78-kube-api-access-nn4wx\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:21 crc kubenswrapper[4565]: I1125 09:18:21.243433 4565 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca742c16-4156-489a-8989-a25f91f6ef78-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:21 crc kubenswrapper[4565]: I1125 09:18:21.713791 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4bfd-account-create-cphjj" Nov 25 09:18:21 crc kubenswrapper[4565]: I1125 09:18:21.714446 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4bfd-account-create-cphjj" event={"ID":"ca742c16-4156-489a-8989-a25f91f6ef78","Type":"ContainerDied","Data":"0c72b9686b8e2062093094aaeae4d3a833a31b37e64823937eac397ec62215e6"} Nov 25 09:18:21 crc kubenswrapper[4565]: I1125 09:18:21.715373 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c72b9686b8e2062093094aaeae4d3a833a31b37e64823937eac397ec62215e6" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.084204 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-5q2pv" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.158421 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7bb9655-fc2f-44fd-8910-541992e2896a-operator-scripts\") pod \"e7bb9655-fc2f-44fd-8910-541992e2896a\" (UID: \"e7bb9655-fc2f-44fd-8910-541992e2896a\") " Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.158593 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwpfv\" (UniqueName: \"kubernetes.io/projected/e7bb9655-fc2f-44fd-8910-541992e2896a-kube-api-access-cwpfv\") pod \"e7bb9655-fc2f-44fd-8910-541992e2896a\" (UID: \"e7bb9655-fc2f-44fd-8910-541992e2896a\") " Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.161051 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7bb9655-fc2f-44fd-8910-541992e2896a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e7bb9655-fc2f-44fd-8910-541992e2896a" (UID: "e7bb9655-fc2f-44fd-8910-541992e2896a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.166330 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7bb9655-fc2f-44fd-8910-541992e2896a-kube-api-access-cwpfv" (OuterVolumeSpecName: "kube-api-access-cwpfv") pod "e7bb9655-fc2f-44fd-8910-541992e2896a" (UID: "e7bb9655-fc2f-44fd-8910-541992e2896a"). InnerVolumeSpecName "kube-api-access-cwpfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.263541 4565 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7bb9655-fc2f-44fd-8910-541992e2896a-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.263622 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwpfv\" (UniqueName: \"kubernetes.io/projected/e7bb9655-fc2f-44fd-8910-541992e2896a-kube-api-access-cwpfv\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.265466 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ztdrc" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.271496 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fbd2-account-create-5l9xb" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.275764 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f100-account-create-sj2lj" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.279838 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-p6cvm" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.364947 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d679723-ad82-4ea3-a423-46db64ebc105-operator-scripts\") pod \"1d679723-ad82-4ea3-a423-46db64ebc105\" (UID: \"1d679723-ad82-4ea3-a423-46db64ebc105\") " Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.365006 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wktw\" (UniqueName: \"kubernetes.io/projected/4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b-kube-api-access-8wktw\") pod \"4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b\" (UID: \"4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b\") " Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.365057 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk928\" (UniqueName: \"kubernetes.io/projected/d8124d68-2866-4758-9728-769b860913ee-kube-api-access-xk928\") pod \"d8124d68-2866-4758-9728-769b860913ee\" (UID: \"d8124d68-2866-4758-9728-769b860913ee\") " Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.365226 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b-operator-scripts\") pod \"4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b\" (UID: \"4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b\") " Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.365434 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-796td\" (UniqueName: \"kubernetes.io/projected/1d679723-ad82-4ea3-a423-46db64ebc105-kube-api-access-796td\") pod \"1d679723-ad82-4ea3-a423-46db64ebc105\" (UID: \"1d679723-ad82-4ea3-a423-46db64ebc105\") " Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.365491 4565 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-llntn\" (UniqueName: \"kubernetes.io/projected/3c48b8c0-09b0-4c38-94a6-9fc970eab3ca-kube-api-access-llntn\") pod \"3c48b8c0-09b0-4c38-94a6-9fc970eab3ca\" (UID: \"3c48b8c0-09b0-4c38-94a6-9fc970eab3ca\") " Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.365533 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c48b8c0-09b0-4c38-94a6-9fc970eab3ca-operator-scripts\") pod \"3c48b8c0-09b0-4c38-94a6-9fc970eab3ca\" (UID: \"3c48b8c0-09b0-4c38-94a6-9fc970eab3ca\") " Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.365599 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d679723-ad82-4ea3-a423-46db64ebc105-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1d679723-ad82-4ea3-a423-46db64ebc105" (UID: "1d679723-ad82-4ea3-a423-46db64ebc105"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.365628 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8124d68-2866-4758-9728-769b860913ee-operator-scripts\") pod \"d8124d68-2866-4758-9728-769b860913ee\" (UID: \"d8124d68-2866-4758-9728-769b860913ee\") " Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.366178 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c48b8c0-09b0-4c38-94a6-9fc970eab3ca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c48b8c0-09b0-4c38-94a6-9fc970eab3ca" (UID: "3c48b8c0-09b0-4c38-94a6-9fc970eab3ca"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.366401 4565 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c48b8c0-09b0-4c38-94a6-9fc970eab3ca-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.366565 4565 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d679723-ad82-4ea3-a423-46db64ebc105-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.366588 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8124d68-2866-4758-9728-769b860913ee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8124d68-2866-4758-9728-769b860913ee" (UID: "d8124d68-2866-4758-9728-769b860913ee"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.366714 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b" (UID: "4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.368552 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b-kube-api-access-8wktw" (OuterVolumeSpecName: "kube-api-access-8wktw") pod "4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b" (UID: "4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b"). InnerVolumeSpecName "kube-api-access-8wktw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.369552 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c48b8c0-09b0-4c38-94a6-9fc970eab3ca-kube-api-access-llntn" (OuterVolumeSpecName: "kube-api-access-llntn") pod "3c48b8c0-09b0-4c38-94a6-9fc970eab3ca" (UID: "3c48b8c0-09b0-4c38-94a6-9fc970eab3ca"). InnerVolumeSpecName "kube-api-access-llntn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.369999 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8124d68-2866-4758-9728-769b860913ee-kube-api-access-xk928" (OuterVolumeSpecName: "kube-api-access-xk928") pod "d8124d68-2866-4758-9728-769b860913ee" (UID: "d8124d68-2866-4758-9728-769b860913ee"). InnerVolumeSpecName "kube-api-access-xk928". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.370464 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d679723-ad82-4ea3-a423-46db64ebc105-kube-api-access-796td" (OuterVolumeSpecName: "kube-api-access-796td") pod "1d679723-ad82-4ea3-a423-46db64ebc105" (UID: "1d679723-ad82-4ea3-a423-46db64ebc105"). InnerVolumeSpecName "kube-api-access-796td". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.468343 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llntn\" (UniqueName: \"kubernetes.io/projected/3c48b8c0-09b0-4c38-94a6-9fc970eab3ca-kube-api-access-llntn\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.468373 4565 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8124d68-2866-4758-9728-769b860913ee-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.468386 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wktw\" (UniqueName: \"kubernetes.io/projected/4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b-kube-api-access-8wktw\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.468397 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk928\" (UniqueName: \"kubernetes.io/projected/d8124d68-2866-4758-9728-769b860913ee-kube-api-access-xk928\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.468407 4565 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.468418 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-796td\" (UniqueName: \"kubernetes.io/projected/1d679723-ad82-4ea3-a423-46db64ebc105-kube-api-access-796td\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.724992 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f100-account-create-sj2lj" 
event={"ID":"3c48b8c0-09b0-4c38-94a6-9fc970eab3ca","Type":"ContainerDied","Data":"6a1efe36956dd077e2f70dfcdb261299e1eaa1e3609a3a426f81436e4e837ddf"} Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.725373 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a1efe36956dd077e2f70dfcdb261299e1eaa1e3609a3a426f81436e4e837ddf" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.725023 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f100-account-create-sj2lj" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.727006 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fbd2-account-create-5l9xb" event={"ID":"d8124d68-2866-4758-9728-769b860913ee","Type":"ContainerDied","Data":"877033227f7270d5855ebc07c60055f6f7efc57a4a49519e5dac7bbb29a167dc"} Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.727040 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="877033227f7270d5855ebc07c60055f6f7efc57a4a49519e5dac7bbb29a167dc" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.727074 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fbd2-account-create-5l9xb" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.730274 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5q2pv" event={"ID":"e7bb9655-fc2f-44fd-8910-541992e2896a","Type":"ContainerDied","Data":"20e65a892b5bc675e6d3419a230ee3d4eab59a83121eb9c97491b27b96a53789"} Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.730350 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20e65a892b5bc675e6d3419a230ee3d4eab59a83121eb9c97491b27b96a53789" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.730465 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-5q2pv" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.735620 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ztdrc" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.735627 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ztdrc" event={"ID":"4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b","Type":"ContainerDied","Data":"80fd0c56fd5c9642c697e1f39995c125f743adf2801d8dad08745e15046385d7"} Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.735789 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80fd0c56fd5c9642c697e1f39995c125f743adf2801d8dad08745e15046385d7" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.737752 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-p6cvm" event={"ID":"1d679723-ad82-4ea3-a423-46db64ebc105","Type":"ContainerDied","Data":"83c8a38aac75ea0c17cbd007b3359173eacb336eafcb50bbc597b5472fdffd0f"} Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.737847 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83c8a38aac75ea0c17cbd007b3359173eacb336eafcb50bbc597b5472fdffd0f" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.737973 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-p6cvm" Nov 25 09:18:22 crc kubenswrapper[4565]: I1125 09:18:22.980110 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.073289 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65c9b8d4f7-fmlwx"] Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.073535 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" podUID="14667942-3d50-46be-9011-b33dafbd106e" containerName="dnsmasq-dns" containerID="cri-o://e42239f08fa606f3f22acd263dcf04ab1f10b79dbb153fd12b1fc265975f4640" gracePeriod=10 Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.078109 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.542248 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.698513 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14667942-3d50-46be-9011-b33dafbd106e-ovsdbserver-sb\") pod \"14667942-3d50-46be-9011-b33dafbd106e\" (UID: \"14667942-3d50-46be-9011-b33dafbd106e\") " Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.698550 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kkg5\" (UniqueName: \"kubernetes.io/projected/14667942-3d50-46be-9011-b33dafbd106e-kube-api-access-9kkg5\") pod \"14667942-3d50-46be-9011-b33dafbd106e\" (UID: \"14667942-3d50-46be-9011-b33dafbd106e\") " Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.698601 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14667942-3d50-46be-9011-b33dafbd106e-config\") pod \"14667942-3d50-46be-9011-b33dafbd106e\" (UID: \"14667942-3d50-46be-9011-b33dafbd106e\") " Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.698773 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14667942-3d50-46be-9011-b33dafbd106e-dns-svc\") pod \"14667942-3d50-46be-9011-b33dafbd106e\" (UID: \"14667942-3d50-46be-9011-b33dafbd106e\") " Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.704538 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14667942-3d50-46be-9011-b33dafbd106e-kube-api-access-9kkg5" (OuterVolumeSpecName: "kube-api-access-9kkg5") pod "14667942-3d50-46be-9011-b33dafbd106e" (UID: "14667942-3d50-46be-9011-b33dafbd106e"). InnerVolumeSpecName "kube-api-access-9kkg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.743475 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14667942-3d50-46be-9011-b33dafbd106e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "14667942-3d50-46be-9011-b33dafbd106e" (UID: "14667942-3d50-46be-9011-b33dafbd106e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.745300 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14667942-3d50-46be-9011-b33dafbd106e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "14667942-3d50-46be-9011-b33dafbd106e" (UID: "14667942-3d50-46be-9011-b33dafbd106e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.750697 4565 generic.go:334] "Generic (PLEG): container finished" podID="14667942-3d50-46be-9011-b33dafbd106e" containerID="e42239f08fa606f3f22acd263dcf04ab1f10b79dbb153fd12b1fc265975f4640" exitCode=0 Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.750736 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" event={"ID":"14667942-3d50-46be-9011-b33dafbd106e","Type":"ContainerDied","Data":"e42239f08fa606f3f22acd263dcf04ab1f10b79dbb153fd12b1fc265975f4640"} Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.750764 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" event={"ID":"14667942-3d50-46be-9011-b33dafbd106e","Type":"ContainerDied","Data":"82921b85fce122742864b118792626f0ceccee1e04b5199c0add1209c216c362"} Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.750782 4565 scope.go:117] "RemoveContainer" containerID="e42239f08fa606f3f22acd263dcf04ab1f10b79dbb153fd12b1fc265975f4640" Nov 25 09:18:23 crc 
kubenswrapper[4565]: I1125 09:18:23.750893 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65c9b8d4f7-fmlwx" Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.764733 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14667942-3d50-46be-9011-b33dafbd106e-config" (OuterVolumeSpecName: "config") pod "14667942-3d50-46be-9011-b33dafbd106e" (UID: "14667942-3d50-46be-9011-b33dafbd106e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.774619 4565 scope.go:117] "RemoveContainer" containerID="58d5819fc63534500aac82f9324594955048aacec1b686ccbe71866102a38bd4" Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.787082 4565 scope.go:117] "RemoveContainer" containerID="e42239f08fa606f3f22acd263dcf04ab1f10b79dbb153fd12b1fc265975f4640" Nov 25 09:18:23 crc kubenswrapper[4565]: E1125 09:18:23.787600 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e42239f08fa606f3f22acd263dcf04ab1f10b79dbb153fd12b1fc265975f4640\": container with ID starting with e42239f08fa606f3f22acd263dcf04ab1f10b79dbb153fd12b1fc265975f4640 not found: ID does not exist" containerID="e42239f08fa606f3f22acd263dcf04ab1f10b79dbb153fd12b1fc265975f4640" Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.787645 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e42239f08fa606f3f22acd263dcf04ab1f10b79dbb153fd12b1fc265975f4640"} err="failed to get container status \"e42239f08fa606f3f22acd263dcf04ab1f10b79dbb153fd12b1fc265975f4640\": rpc error: code = NotFound desc = could not find container \"e42239f08fa606f3f22acd263dcf04ab1f10b79dbb153fd12b1fc265975f4640\": container with ID starting with e42239f08fa606f3f22acd263dcf04ab1f10b79dbb153fd12b1fc265975f4640 not found: ID does not 
exist" Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.787663 4565 scope.go:117] "RemoveContainer" containerID="58d5819fc63534500aac82f9324594955048aacec1b686ccbe71866102a38bd4" Nov 25 09:18:23 crc kubenswrapper[4565]: E1125 09:18:23.788015 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58d5819fc63534500aac82f9324594955048aacec1b686ccbe71866102a38bd4\": container with ID starting with 58d5819fc63534500aac82f9324594955048aacec1b686ccbe71866102a38bd4 not found: ID does not exist" containerID="58d5819fc63534500aac82f9324594955048aacec1b686ccbe71866102a38bd4" Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.788045 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d5819fc63534500aac82f9324594955048aacec1b686ccbe71866102a38bd4"} err="failed to get container status \"58d5819fc63534500aac82f9324594955048aacec1b686ccbe71866102a38bd4\": rpc error: code = NotFound desc = could not find container \"58d5819fc63534500aac82f9324594955048aacec1b686ccbe71866102a38bd4\": container with ID starting with 58d5819fc63534500aac82f9324594955048aacec1b686ccbe71866102a38bd4 not found: ID does not exist" Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.800528 4565 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14667942-3d50-46be-9011-b33dafbd106e-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.800552 4565 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14667942-3d50-46be-9011-b33dafbd106e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.800562 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kkg5\" (UniqueName: 
\"kubernetes.io/projected/14667942-3d50-46be-9011-b33dafbd106e-kube-api-access-9kkg5\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:23 crc kubenswrapper[4565]: I1125 09:18:23.800570 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14667942-3d50-46be-9011-b33dafbd106e-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.081055 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65c9b8d4f7-fmlwx"] Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.092046 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65c9b8d4f7-fmlwx"] Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.330013 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-pmdd7"] Nov 25 09:18:24 crc kubenswrapper[4565]: E1125 09:18:24.330398 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14667942-3d50-46be-9011-b33dafbd106e" containerName="dnsmasq-dns" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.330418 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="14667942-3d50-46be-9011-b33dafbd106e" containerName="dnsmasq-dns" Nov 25 09:18:24 crc kubenswrapper[4565]: E1125 09:18:24.330426 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b" containerName="mariadb-database-create" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.330432 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b" containerName="mariadb-database-create" Nov 25 09:18:24 crc kubenswrapper[4565]: E1125 09:18:24.330448 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8124d68-2866-4758-9728-769b860913ee" containerName="mariadb-account-create" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.330455 4565 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d8124d68-2866-4758-9728-769b860913ee" containerName="mariadb-account-create" Nov 25 09:18:24 crc kubenswrapper[4565]: E1125 09:18:24.330468 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c48b8c0-09b0-4c38-94a6-9fc970eab3ca" containerName="mariadb-account-create" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.330473 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c48b8c0-09b0-4c38-94a6-9fc970eab3ca" containerName="mariadb-account-create" Nov 25 09:18:24 crc kubenswrapper[4565]: E1125 09:18:24.330482 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14667942-3d50-46be-9011-b33dafbd106e" containerName="init" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.330487 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="14667942-3d50-46be-9011-b33dafbd106e" containerName="init" Nov 25 09:18:24 crc kubenswrapper[4565]: E1125 09:18:24.330496 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca742c16-4156-489a-8989-a25f91f6ef78" containerName="mariadb-account-create" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.330502 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca742c16-4156-489a-8989-a25f91f6ef78" containerName="mariadb-account-create" Nov 25 09:18:24 crc kubenswrapper[4565]: E1125 09:18:24.330511 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d679723-ad82-4ea3-a423-46db64ebc105" containerName="mariadb-database-create" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.330518 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d679723-ad82-4ea3-a423-46db64ebc105" containerName="mariadb-database-create" Nov 25 09:18:24 crc kubenswrapper[4565]: E1125 09:18:24.330525 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7bb9655-fc2f-44fd-8910-541992e2896a" containerName="mariadb-database-create" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.330531 4565 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="e7bb9655-fc2f-44fd-8910-541992e2896a" containerName="mariadb-database-create" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.330696 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="14667942-3d50-46be-9011-b33dafbd106e" containerName="dnsmasq-dns" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.330708 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8124d68-2866-4758-9728-769b860913ee" containerName="mariadb-account-create" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.330718 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b" containerName="mariadb-database-create" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.330728 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c48b8c0-09b0-4c38-94a6-9fc970eab3ca" containerName="mariadb-account-create" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.330735 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca742c16-4156-489a-8989-a25f91f6ef78" containerName="mariadb-account-create" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.330742 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7bb9655-fc2f-44fd-8910-541992e2896a" containerName="mariadb-database-create" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.330750 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d679723-ad82-4ea3-a423-46db64ebc105" containerName="mariadb-database-create" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.331377 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-pmdd7" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.333184 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.334257 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-j54bw" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.338941 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-pmdd7"] Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.412024 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/405219a8-725f-4996-9efa-02837290d5e8-db-sync-config-data\") pod \"glance-db-sync-pmdd7\" (UID: \"405219a8-725f-4996-9efa-02837290d5e8\") " pod="openstack/glance-db-sync-pmdd7" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.412091 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/405219a8-725f-4996-9efa-02837290d5e8-config-data\") pod \"glance-db-sync-pmdd7\" (UID: \"405219a8-725f-4996-9efa-02837290d5e8\") " pod="openstack/glance-db-sync-pmdd7" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.412267 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/405219a8-725f-4996-9efa-02837290d5e8-combined-ca-bundle\") pod \"glance-db-sync-pmdd7\" (UID: \"405219a8-725f-4996-9efa-02837290d5e8\") " pod="openstack/glance-db-sync-pmdd7" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.412324 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8fvq\" (UniqueName: 
\"kubernetes.io/projected/405219a8-725f-4996-9efa-02837290d5e8-kube-api-access-g8fvq\") pod \"glance-db-sync-pmdd7\" (UID: \"405219a8-725f-4996-9efa-02837290d5e8\") " pod="openstack/glance-db-sync-pmdd7" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.513414 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8fvq\" (UniqueName: \"kubernetes.io/projected/405219a8-725f-4996-9efa-02837290d5e8-kube-api-access-g8fvq\") pod \"glance-db-sync-pmdd7\" (UID: \"405219a8-725f-4996-9efa-02837290d5e8\") " pod="openstack/glance-db-sync-pmdd7" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.513522 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/405219a8-725f-4996-9efa-02837290d5e8-db-sync-config-data\") pod \"glance-db-sync-pmdd7\" (UID: \"405219a8-725f-4996-9efa-02837290d5e8\") " pod="openstack/glance-db-sync-pmdd7" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.513559 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/405219a8-725f-4996-9efa-02837290d5e8-config-data\") pod \"glance-db-sync-pmdd7\" (UID: \"405219a8-725f-4996-9efa-02837290d5e8\") " pod="openstack/glance-db-sync-pmdd7" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.513608 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/405219a8-725f-4996-9efa-02837290d5e8-combined-ca-bundle\") pod \"glance-db-sync-pmdd7\" (UID: \"405219a8-725f-4996-9efa-02837290d5e8\") " pod="openstack/glance-db-sync-pmdd7" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.519179 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/405219a8-725f-4996-9efa-02837290d5e8-combined-ca-bundle\") pod \"glance-db-sync-pmdd7\" 
(UID: \"405219a8-725f-4996-9efa-02837290d5e8\") " pod="openstack/glance-db-sync-pmdd7" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.519189 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/405219a8-725f-4996-9efa-02837290d5e8-db-sync-config-data\") pod \"glance-db-sync-pmdd7\" (UID: \"405219a8-725f-4996-9efa-02837290d5e8\") " pod="openstack/glance-db-sync-pmdd7" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.528752 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/405219a8-725f-4996-9efa-02837290d5e8-config-data\") pod \"glance-db-sync-pmdd7\" (UID: \"405219a8-725f-4996-9efa-02837290d5e8\") " pod="openstack/glance-db-sync-pmdd7" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.529457 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8fvq\" (UniqueName: \"kubernetes.io/projected/405219a8-725f-4996-9efa-02837290d5e8-kube-api-access-g8fvq\") pod \"glance-db-sync-pmdd7\" (UID: \"405219a8-725f-4996-9efa-02837290d5e8\") " pod="openstack/glance-db-sync-pmdd7" Nov 25 09:18:24 crc kubenswrapper[4565]: I1125 09:18:24.644672 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-pmdd7" Nov 25 09:18:25 crc kubenswrapper[4565]: I1125 09:18:25.113193 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14667942-3d50-46be-9011-b33dafbd106e" path="/var/lib/kubelet/pods/14667942-3d50-46be-9011-b33dafbd106e/volumes" Nov 25 09:18:25 crc kubenswrapper[4565]: I1125 09:18:25.119541 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-pmdd7"] Nov 25 09:18:25 crc kubenswrapper[4565]: W1125 09:18:25.121214 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod405219a8_725f_4996_9efa_02837290d5e8.slice/crio-0ae3dd8f154238fbad589420e68af025c3f6ac9f3d32944a39fcf4718acdace4 WatchSource:0}: Error finding container 0ae3dd8f154238fbad589420e68af025c3f6ac9f3d32944a39fcf4718acdace4: Status 404 returned error can't find the container with id 0ae3dd8f154238fbad589420e68af025c3f6ac9f3d32944a39fcf4718acdace4 Nov 25 09:18:25 crc kubenswrapper[4565]: I1125 09:18:25.766480 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pmdd7" event={"ID":"405219a8-725f-4996-9efa-02837290d5e8","Type":"ContainerStarted","Data":"0ae3dd8f154238fbad589420e68af025c3f6ac9f3d32944a39fcf4718acdace4"} Nov 25 09:18:28 crc kubenswrapper[4565]: I1125 09:18:28.453059 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Nov 25 09:18:28 crc kubenswrapper[4565]: I1125 09:18:28.812908 4565 generic.go:334] "Generic (PLEG): container finished" podID="46428d34-ed8b-4584-954a-0c51d96b1c9c" containerID="e4896b199da22c02f746a769e2a48931780200f33e23e7479d74075df91ee76c" exitCode=0 Nov 25 09:18:28 crc kubenswrapper[4565]: I1125 09:18:28.813004 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"46428d34-ed8b-4584-954a-0c51d96b1c9c","Type":"ContainerDied","Data":"e4896b199da22c02f746a769e2a48931780200f33e23e7479d74075df91ee76c"} Nov 25 09:18:28 crc kubenswrapper[4565]: I1125 09:18:28.816398 4565 generic.go:334] "Generic (PLEG): container finished" podID="b0cc10ca-7483-447d-a1ed-1566c994efdc" containerID="d94652e304634ec33bfb162b4c2b317c7742bd20ae9567f935e47470498b93ac" exitCode=0 Nov 25 09:18:28 crc kubenswrapper[4565]: I1125 09:18:28.816436 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b0cc10ca-7483-447d-a1ed-1566c994efdc","Type":"ContainerDied","Data":"d94652e304634ec33bfb162b4c2b317c7742bd20ae9567f935e47470498b93ac"} Nov 25 09:18:29 crc kubenswrapper[4565]: I1125 09:18:29.829872 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b0cc10ca-7483-447d-a1ed-1566c994efdc","Type":"ContainerStarted","Data":"45b00b7971a56b1f1127cbd3fba4aae085b77e7b2f1cee486f8a50ffb35f30f7"} Nov 25 09:18:29 crc kubenswrapper[4565]: I1125 09:18:29.830457 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:18:29 crc kubenswrapper[4565]: I1125 09:18:29.834082 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"46428d34-ed8b-4584-954a-0c51d96b1c9c","Type":"ContainerStarted","Data":"c0702f93885d28529a2af7990a35ab326532d86528c519f38b7a30e9aa51de04"} Nov 25 09:18:29 crc kubenswrapper[4565]: I1125 09:18:29.834324 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 25 09:18:29 crc kubenswrapper[4565]: I1125 09:18:29.856377 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.66859121 podStartE2EDuration="56.856359687s" podCreationTimestamp="2025-11-25 09:17:33 +0000 UTC" firstStartedPulling="2025-11-25 
09:17:35.648382324 +0000 UTC m=+788.850877462" lastFinishedPulling="2025-11-25 09:17:55.836150802 +0000 UTC m=+809.038645939" observedRunningTime="2025-11-25 09:18:29.855639519 +0000 UTC m=+843.058134657" watchObservedRunningTime="2025-11-25 09:18:29.856359687 +0000 UTC m=+843.058854825" Nov 25 09:18:29 crc kubenswrapper[4565]: I1125 09:18:29.886227 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.126913637 podStartE2EDuration="55.886201195s" podCreationTimestamp="2025-11-25 09:17:34 +0000 UTC" firstStartedPulling="2025-11-25 09:17:36.012301668 +0000 UTC m=+789.214796796" lastFinishedPulling="2025-11-25 09:17:55.771589216 +0000 UTC m=+808.974084354" observedRunningTime="2025-11-25 09:18:29.881424641 +0000 UTC m=+843.083919778" watchObservedRunningTime="2025-11-25 09:18:29.886201195 +0000 UTC m=+843.088696333" Nov 25 09:18:35 crc kubenswrapper[4565]: I1125 09:18:35.732127 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-67p5z"] Nov 25 09:18:35 crc kubenswrapper[4565]: I1125 09:18:35.733657 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-67p5z" Nov 25 09:18:35 crc kubenswrapper[4565]: I1125 09:18:35.743027 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-67p5z"] Nov 25 09:18:35 crc kubenswrapper[4565]: I1125 09:18:35.833704 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcx2m\" (UniqueName: \"kubernetes.io/projected/92bcfb29-921b-482a-97be-e651dc2d0ff0-kube-api-access-dcx2m\") pod \"redhat-operators-67p5z\" (UID: \"92bcfb29-921b-482a-97be-e651dc2d0ff0\") " pod="openshift-marketplace/redhat-operators-67p5z" Nov 25 09:18:35 crc kubenswrapper[4565]: I1125 09:18:35.833801 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92bcfb29-921b-482a-97be-e651dc2d0ff0-catalog-content\") pod \"redhat-operators-67p5z\" (UID: \"92bcfb29-921b-482a-97be-e651dc2d0ff0\") " pod="openshift-marketplace/redhat-operators-67p5z" Nov 25 09:18:35 crc kubenswrapper[4565]: I1125 09:18:35.833899 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92bcfb29-921b-482a-97be-e651dc2d0ff0-utilities\") pod \"redhat-operators-67p5z\" (UID: \"92bcfb29-921b-482a-97be-e651dc2d0ff0\") " pod="openshift-marketplace/redhat-operators-67p5z" Nov 25 09:18:35 crc kubenswrapper[4565]: I1125 09:18:35.853741 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-7k468" podUID="67e2fa61-acc9-415b-9e10-0a35b6a3feb7" containerName="ovn-controller" probeResult="failure" output=< Nov 25 09:18:35 crc kubenswrapper[4565]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 25 09:18:35 crc kubenswrapper[4565]: > Nov 25 09:18:35 crc kubenswrapper[4565]: I1125 09:18:35.921695 4565 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-qhxwx" Nov 25 09:18:35 crc kubenswrapper[4565]: I1125 09:18:35.935228 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcx2m\" (UniqueName: \"kubernetes.io/projected/92bcfb29-921b-482a-97be-e651dc2d0ff0-kube-api-access-dcx2m\") pod \"redhat-operators-67p5z\" (UID: \"92bcfb29-921b-482a-97be-e651dc2d0ff0\") " pod="openshift-marketplace/redhat-operators-67p5z" Nov 25 09:18:35 crc kubenswrapper[4565]: I1125 09:18:35.935316 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92bcfb29-921b-482a-97be-e651dc2d0ff0-catalog-content\") pod \"redhat-operators-67p5z\" (UID: \"92bcfb29-921b-482a-97be-e651dc2d0ff0\") " pod="openshift-marketplace/redhat-operators-67p5z" Nov 25 09:18:35 crc kubenswrapper[4565]: I1125 09:18:35.935398 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92bcfb29-921b-482a-97be-e651dc2d0ff0-utilities\") pod \"redhat-operators-67p5z\" (UID: \"92bcfb29-921b-482a-97be-e651dc2d0ff0\") " pod="openshift-marketplace/redhat-operators-67p5z" Nov 25 09:18:35 crc kubenswrapper[4565]: I1125 09:18:35.935864 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92bcfb29-921b-482a-97be-e651dc2d0ff0-utilities\") pod \"redhat-operators-67p5z\" (UID: \"92bcfb29-921b-482a-97be-e651dc2d0ff0\") " pod="openshift-marketplace/redhat-operators-67p5z" Nov 25 09:18:35 crc kubenswrapper[4565]: I1125 09:18:35.936085 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92bcfb29-921b-482a-97be-e651dc2d0ff0-catalog-content\") pod \"redhat-operators-67p5z\" (UID: \"92bcfb29-921b-482a-97be-e651dc2d0ff0\") " 
pod="openshift-marketplace/redhat-operators-67p5z" Nov 25 09:18:35 crc kubenswrapper[4565]: I1125 09:18:35.985394 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcx2m\" (UniqueName: \"kubernetes.io/projected/92bcfb29-921b-482a-97be-e651dc2d0ff0-kube-api-access-dcx2m\") pod \"redhat-operators-67p5z\" (UID: \"92bcfb29-921b-482a-97be-e651dc2d0ff0\") " pod="openshift-marketplace/redhat-operators-67p5z" Nov 25 09:18:36 crc kubenswrapper[4565]: I1125 09:18:36.047799 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-67p5z" Nov 25 09:18:40 crc kubenswrapper[4565]: I1125 09:18:40.295066 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-67p5z"] Nov 25 09:18:40 crc kubenswrapper[4565]: W1125 09:18:40.299555 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92bcfb29_921b_482a_97be_e651dc2d0ff0.slice/crio-04ed6cd387d0d0d27991e29bbffd998939a59c9e657449eb7b80d93b1895b455 WatchSource:0}: Error finding container 04ed6cd387d0d0d27991e29bbffd998939a59c9e657449eb7b80d93b1895b455: Status 404 returned error can't find the container with id 04ed6cd387d0d0d27991e29bbffd998939a59c9e657449eb7b80d93b1895b455 Nov 25 09:18:40 crc kubenswrapper[4565]: I1125 09:18:40.839432 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-7k468" podUID="67e2fa61-acc9-415b-9e10-0a35b6a3feb7" containerName="ovn-controller" probeResult="failure" output=< Nov 25 09:18:40 crc kubenswrapper[4565]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 25 09:18:40 crc kubenswrapper[4565]: > Nov 25 09:18:40 crc kubenswrapper[4565]: I1125 09:18:40.898339 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-qhxwx" Nov 25 09:18:40 crc 
kubenswrapper[4565]: I1125 09:18:40.976328 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pmdd7" event={"ID":"405219a8-725f-4996-9efa-02837290d5e8","Type":"ContainerStarted","Data":"21b8de1db821f19fcd7a96e05e125e48165f9c3e90de2342de80a33e5a1481aa"} Nov 25 09:18:40 crc kubenswrapper[4565]: I1125 09:18:40.980077 4565 generic.go:334] "Generic (PLEG): container finished" podID="92bcfb29-921b-482a-97be-e651dc2d0ff0" containerID="805d41259d918cab1942959ff410d7f8f2bcf65c92a98749408d26661e405a15" exitCode=0 Nov 25 09:18:40 crc kubenswrapper[4565]: I1125 09:18:40.980128 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67p5z" event={"ID":"92bcfb29-921b-482a-97be-e651dc2d0ff0","Type":"ContainerDied","Data":"805d41259d918cab1942959ff410d7f8f2bcf65c92a98749408d26661e405a15"} Nov 25 09:18:40 crc kubenswrapper[4565]: I1125 09:18:40.980173 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67p5z" event={"ID":"92bcfb29-921b-482a-97be-e651dc2d0ff0","Type":"ContainerStarted","Data":"04ed6cd387d0d0d27991e29bbffd998939a59c9e657449eb7b80d93b1895b455"} Nov 25 09:18:40 crc kubenswrapper[4565]: I1125 09:18:40.994519 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-pmdd7" podStartSLOduration=2.21628295 podStartE2EDuration="16.99450446s" podCreationTimestamp="2025-11-25 09:18:24 +0000 UTC" firstStartedPulling="2025-11-25 09:18:25.122579884 +0000 UTC m=+838.325075022" lastFinishedPulling="2025-11-25 09:18:39.900801404 +0000 UTC m=+853.103296532" observedRunningTime="2025-11-25 09:18:40.992375446 +0000 UTC m=+854.194870584" watchObservedRunningTime="2025-11-25 09:18:40.99450446 +0000 UTC m=+854.196999588" Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.149656 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-7k468-config-rqzs6"] Nov 25 09:18:41 crc 
kubenswrapper[4565]: I1125 09:18:41.152018 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7k468-config-rqzs6" Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.164423 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.170620 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7k468-config-rqzs6"] Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.249921 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-var-log-ovn\") pod \"ovn-controller-7k468-config-rqzs6\" (UID: \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\") " pod="openstack/ovn-controller-7k468-config-rqzs6" Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.250151 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-scripts\") pod \"ovn-controller-7k468-config-rqzs6\" (UID: \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\") " pod="openstack/ovn-controller-7k468-config-rqzs6" Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.250235 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-additional-scripts\") pod \"ovn-controller-7k468-config-rqzs6\" (UID: \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\") " pod="openstack/ovn-controller-7k468-config-rqzs6" Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.250293 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-var-run-ovn\") pod \"ovn-controller-7k468-config-rqzs6\" (UID: \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\") " pod="openstack/ovn-controller-7k468-config-rqzs6" Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.250537 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzvxg\" (UniqueName: \"kubernetes.io/projected/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-kube-api-access-qzvxg\") pod \"ovn-controller-7k468-config-rqzs6\" (UID: \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\") " pod="openstack/ovn-controller-7k468-config-rqzs6" Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.250651 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-var-run\") pod \"ovn-controller-7k468-config-rqzs6\" (UID: \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\") " pod="openstack/ovn-controller-7k468-config-rqzs6" Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.352206 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzvxg\" (UniqueName: \"kubernetes.io/projected/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-kube-api-access-qzvxg\") pod \"ovn-controller-7k468-config-rqzs6\" (UID: \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\") " pod="openstack/ovn-controller-7k468-config-rqzs6" Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.352304 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-var-run\") pod \"ovn-controller-7k468-config-rqzs6\" (UID: \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\") " pod="openstack/ovn-controller-7k468-config-rqzs6" Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.352370 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-var-log-ovn\") pod \"ovn-controller-7k468-config-rqzs6\" (UID: \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\") " pod="openstack/ovn-controller-7k468-config-rqzs6" Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.352448 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-scripts\") pod \"ovn-controller-7k468-config-rqzs6\" (UID: \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\") " pod="openstack/ovn-controller-7k468-config-rqzs6" Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.352474 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-additional-scripts\") pod \"ovn-controller-7k468-config-rqzs6\" (UID: \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\") " pod="openstack/ovn-controller-7k468-config-rqzs6" Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.352500 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-var-run-ovn\") pod \"ovn-controller-7k468-config-rqzs6\" (UID: \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\") " pod="openstack/ovn-controller-7k468-config-rqzs6" Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.352734 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-var-run\") pod \"ovn-controller-7k468-config-rqzs6\" (UID: \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\") " pod="openstack/ovn-controller-7k468-config-rqzs6" Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.352788 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-var-run-ovn\") pod \"ovn-controller-7k468-config-rqzs6\" (UID: \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\") " pod="openstack/ovn-controller-7k468-config-rqzs6" Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.352848 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-var-log-ovn\") pod \"ovn-controller-7k468-config-rqzs6\" (UID: \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\") " pod="openstack/ovn-controller-7k468-config-rqzs6" Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.353720 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-additional-scripts\") pod \"ovn-controller-7k468-config-rqzs6\" (UID: \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\") " pod="openstack/ovn-controller-7k468-config-rqzs6" Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.354911 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-scripts\") pod \"ovn-controller-7k468-config-rqzs6\" (UID: \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\") " pod="openstack/ovn-controller-7k468-config-rqzs6" Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.373198 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzvxg\" (UniqueName: \"kubernetes.io/projected/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-kube-api-access-qzvxg\") pod \"ovn-controller-7k468-config-rqzs6\" (UID: \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\") " pod="openstack/ovn-controller-7k468-config-rqzs6" Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.484450 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-7k468-config-rqzs6" Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.938544 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7k468-config-rqzs6"] Nov 25 09:18:41 crc kubenswrapper[4565]: W1125 09:18:41.939050 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0bfcaccd_2faf_4d76_97a2_e6f47fea26b2.slice/crio-5707a4189e4a00f1a6280acc60bf7e13de33d0ef5bd954f845d2c73ad5c72299 WatchSource:0}: Error finding container 5707a4189e4a00f1a6280acc60bf7e13de33d0ef5bd954f845d2c73ad5c72299: Status 404 returned error can't find the container with id 5707a4189e4a00f1a6280acc60bf7e13de33d0ef5bd954f845d2c73ad5c72299 Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.993918 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7k468-config-rqzs6" event={"ID":"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2","Type":"ContainerStarted","Data":"5707a4189e4a00f1a6280acc60bf7e13de33d0ef5bd954f845d2c73ad5c72299"} Nov 25 09:18:41 crc kubenswrapper[4565]: I1125 09:18:41.997249 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67p5z" event={"ID":"92bcfb29-921b-482a-97be-e651dc2d0ff0","Type":"ContainerStarted","Data":"65263ed301573bdc1c06512f776112473cbc8d1b8380004e44fae9f9e91fb967"} Nov 25 09:18:43 crc kubenswrapper[4565]: I1125 09:18:43.009914 4565 generic.go:334] "Generic (PLEG): container finished" podID="92bcfb29-921b-482a-97be-e651dc2d0ff0" containerID="65263ed301573bdc1c06512f776112473cbc8d1b8380004e44fae9f9e91fb967" exitCode=0 Nov 25 09:18:43 crc kubenswrapper[4565]: I1125 09:18:43.009994 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67p5z" event={"ID":"92bcfb29-921b-482a-97be-e651dc2d0ff0","Type":"ContainerDied","Data":"65263ed301573bdc1c06512f776112473cbc8d1b8380004e44fae9f9e91fb967"} 
Nov 25 09:18:43 crc kubenswrapper[4565]: I1125 09:18:43.014556 4565 generic.go:334] "Generic (PLEG): container finished" podID="0bfcaccd-2faf-4d76-97a2-e6f47fea26b2" containerID="12ee085019ef236d8049f29ca4d408b616bcc6469785d3af3a3aaba23ecb87af" exitCode=0 Nov 25 09:18:43 crc kubenswrapper[4565]: I1125 09:18:43.014614 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7k468-config-rqzs6" event={"ID":"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2","Type":"ContainerDied","Data":"12ee085019ef236d8049f29ca4d408b616bcc6469785d3af3a3aaba23ecb87af"} Nov 25 09:18:44 crc kubenswrapper[4565]: I1125 09:18:44.024536 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67p5z" event={"ID":"92bcfb29-921b-482a-97be-e651dc2d0ff0","Type":"ContainerStarted","Data":"1bb60afe064f448e07e5eed2a8792637b12565d56c2c242aeaebe4ae849e99be"} Nov 25 09:18:44 crc kubenswrapper[4565]: I1125 09:18:44.044336 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-67p5z" podStartSLOduration=6.511545751 podStartE2EDuration="9.044312636s" podCreationTimestamp="2025-11-25 09:18:35 +0000 UTC" firstStartedPulling="2025-11-25 09:18:40.982643099 +0000 UTC m=+854.185138237" lastFinishedPulling="2025-11-25 09:18:43.515409984 +0000 UTC m=+856.717905122" observedRunningTime="2025-11-25 09:18:44.042355375 +0000 UTC m=+857.244850513" watchObservedRunningTime="2025-11-25 09:18:44.044312636 +0000 UTC m=+857.246807773" Nov 25 09:18:44 crc kubenswrapper[4565]: I1125 09:18:44.337258 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-7k468-config-rqzs6" Nov 25 09:18:44 crc kubenswrapper[4565]: I1125 09:18:44.426323 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-var-log-ovn\") pod \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\" (UID: \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\") " Nov 25 09:18:44 crc kubenswrapper[4565]: I1125 09:18:44.426368 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-var-run\") pod \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\" (UID: \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\") " Nov 25 09:18:44 crc kubenswrapper[4565]: I1125 09:18:44.426396 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-scripts\") pod \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\" (UID: \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\") " Nov 25 09:18:44 crc kubenswrapper[4565]: I1125 09:18:44.426415 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-var-run-ovn\") pod \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\" (UID: \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\") " Nov 25 09:18:44 crc kubenswrapper[4565]: I1125 09:18:44.426446 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-var-run" (OuterVolumeSpecName: "var-run") pod "0bfcaccd-2faf-4d76-97a2-e6f47fea26b2" (UID: "0bfcaccd-2faf-4d76-97a2-e6f47fea26b2"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:18:44 crc kubenswrapper[4565]: I1125 09:18:44.426450 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "0bfcaccd-2faf-4d76-97a2-e6f47fea26b2" (UID: "0bfcaccd-2faf-4d76-97a2-e6f47fea26b2"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:18:44 crc kubenswrapper[4565]: I1125 09:18:44.426504 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzvxg\" (UniqueName: \"kubernetes.io/projected/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-kube-api-access-qzvxg\") pod \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\" (UID: \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\") " Nov 25 09:18:44 crc kubenswrapper[4565]: I1125 09:18:44.426547 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "0bfcaccd-2faf-4d76-97a2-e6f47fea26b2" (UID: "0bfcaccd-2faf-4d76-97a2-e6f47fea26b2"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:18:44 crc kubenswrapper[4565]: I1125 09:18:44.426556 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-additional-scripts\") pod \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\" (UID: \"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2\") " Nov 25 09:18:44 crc kubenswrapper[4565]: I1125 09:18:44.426875 4565 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:44 crc kubenswrapper[4565]: I1125 09:18:44.426889 4565 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-var-run\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:44 crc kubenswrapper[4565]: I1125 09:18:44.426897 4565 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:44 crc kubenswrapper[4565]: I1125 09:18:44.427111 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "0bfcaccd-2faf-4d76-97a2-e6f47fea26b2" (UID: "0bfcaccd-2faf-4d76-97a2-e6f47fea26b2"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:44 crc kubenswrapper[4565]: I1125 09:18:44.427391 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-scripts" (OuterVolumeSpecName: "scripts") pod "0bfcaccd-2faf-4d76-97a2-e6f47fea26b2" (UID: "0bfcaccd-2faf-4d76-97a2-e6f47fea26b2"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:44 crc kubenswrapper[4565]: I1125 09:18:44.432538 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-kube-api-access-qzvxg" (OuterVolumeSpecName: "kube-api-access-qzvxg") pod "0bfcaccd-2faf-4d76-97a2-e6f47fea26b2" (UID: "0bfcaccd-2faf-4d76-97a2-e6f47fea26b2"). InnerVolumeSpecName "kube-api-access-qzvxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:18:44 crc kubenswrapper[4565]: I1125 09:18:44.528098 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:44 crc kubenswrapper[4565]: I1125 09:18:44.528135 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzvxg\" (UniqueName: \"kubernetes.io/projected/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-kube-api-access-qzvxg\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:44 crc kubenswrapper[4565]: I1125 09:18:44.528149 4565 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:45 crc kubenswrapper[4565]: I1125 09:18:45.026094 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:18:45 crc kubenswrapper[4565]: I1125 09:18:45.033785 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7k468-config-rqzs6" event={"ID":"0bfcaccd-2faf-4d76-97a2-e6f47fea26b2","Type":"ContainerDied","Data":"5707a4189e4a00f1a6280acc60bf7e13de33d0ef5bd954f845d2c73ad5c72299"} Nov 25 09:18:45 crc kubenswrapper[4565]: I1125 09:18:45.033818 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-7k468-config-rqzs6" Nov 25 09:18:45 crc kubenswrapper[4565]: I1125 09:18:45.033826 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5707a4189e4a00f1a6280acc60bf7e13de33d0ef5bd954f845d2c73ad5c72299" Nov 25 09:18:45 crc kubenswrapper[4565]: I1125 09:18:45.036131 4565 generic.go:334] "Generic (PLEG): container finished" podID="405219a8-725f-4996-9efa-02837290d5e8" containerID="21b8de1db821f19fcd7a96e05e125e48165f9c3e90de2342de80a33e5a1481aa" exitCode=0 Nov 25 09:18:45 crc kubenswrapper[4565]: I1125 09:18:45.036166 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pmdd7" event={"ID":"405219a8-725f-4996-9efa-02837290d5e8","Type":"ContainerDied","Data":"21b8de1db821f19fcd7a96e05e125e48165f9c3e90de2342de80a33e5a1481aa"} Nov 25 09:18:45 crc kubenswrapper[4565]: I1125 09:18:45.375138 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 25 09:18:45 crc kubenswrapper[4565]: I1125 09:18:45.470647 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-7k468-config-rqzs6"] Nov 25 09:18:45 crc kubenswrapper[4565]: I1125 09:18:45.477403 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-7k468-config-rqzs6"] Nov 25 09:18:45 crc kubenswrapper[4565]: I1125 09:18:45.837264 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-7k468" Nov 25 09:18:46 crc kubenswrapper[4565]: I1125 09:18:46.048001 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-67p5z" Nov 25 09:18:46 crc kubenswrapper[4565]: I1125 09:18:46.048048 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-67p5z" Nov 25 09:18:46 crc kubenswrapper[4565]: I1125 09:18:46.494445 4565 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pmdd7" Nov 25 09:18:46 crc kubenswrapper[4565]: I1125 09:18:46.576184 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8fvq\" (UniqueName: \"kubernetes.io/projected/405219a8-725f-4996-9efa-02837290d5e8-kube-api-access-g8fvq\") pod \"405219a8-725f-4996-9efa-02837290d5e8\" (UID: \"405219a8-725f-4996-9efa-02837290d5e8\") " Nov 25 09:18:46 crc kubenswrapper[4565]: I1125 09:18:46.576286 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/405219a8-725f-4996-9efa-02837290d5e8-config-data\") pod \"405219a8-725f-4996-9efa-02837290d5e8\" (UID: \"405219a8-725f-4996-9efa-02837290d5e8\") " Nov 25 09:18:46 crc kubenswrapper[4565]: I1125 09:18:46.576354 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/405219a8-725f-4996-9efa-02837290d5e8-db-sync-config-data\") pod \"405219a8-725f-4996-9efa-02837290d5e8\" (UID: \"405219a8-725f-4996-9efa-02837290d5e8\") " Nov 25 09:18:46 crc kubenswrapper[4565]: I1125 09:18:46.576438 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/405219a8-725f-4996-9efa-02837290d5e8-combined-ca-bundle\") pod \"405219a8-725f-4996-9efa-02837290d5e8\" (UID: \"405219a8-725f-4996-9efa-02837290d5e8\") " Nov 25 09:18:46 crc kubenswrapper[4565]: I1125 09:18:46.593870 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/405219a8-725f-4996-9efa-02837290d5e8-kube-api-access-g8fvq" (OuterVolumeSpecName: "kube-api-access-g8fvq") pod "405219a8-725f-4996-9efa-02837290d5e8" (UID: "405219a8-725f-4996-9efa-02837290d5e8"). InnerVolumeSpecName "kube-api-access-g8fvq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:18:46 crc kubenswrapper[4565]: I1125 09:18:46.609964 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/405219a8-725f-4996-9efa-02837290d5e8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "405219a8-725f-4996-9efa-02837290d5e8" (UID: "405219a8-725f-4996-9efa-02837290d5e8"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:18:46 crc kubenswrapper[4565]: I1125 09:18:46.611853 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/405219a8-725f-4996-9efa-02837290d5e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "405219a8-725f-4996-9efa-02837290d5e8" (UID: "405219a8-725f-4996-9efa-02837290d5e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:18:46 crc kubenswrapper[4565]: I1125 09:18:46.645645 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/405219a8-725f-4996-9efa-02837290d5e8-config-data" (OuterVolumeSpecName: "config-data") pod "405219a8-725f-4996-9efa-02837290d5e8" (UID: "405219a8-725f-4996-9efa-02837290d5e8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:18:46 crc kubenswrapper[4565]: I1125 09:18:46.678841 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/405219a8-725f-4996-9efa-02837290d5e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:46 crc kubenswrapper[4565]: I1125 09:18:46.678960 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8fvq\" (UniqueName: \"kubernetes.io/projected/405219a8-725f-4996-9efa-02837290d5e8-kube-api-access-g8fvq\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:46 crc kubenswrapper[4565]: I1125 09:18:46.679030 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/405219a8-725f-4996-9efa-02837290d5e8-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:46 crc kubenswrapper[4565]: I1125 09:18:46.679086 4565 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/405219a8-725f-4996-9efa-02837290d5e8-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.056603 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pmdd7" event={"ID":"405219a8-725f-4996-9efa-02837290d5e8","Type":"ContainerDied","Data":"0ae3dd8f154238fbad589420e68af025c3f6ac9f3d32944a39fcf4718acdace4"} Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.056648 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ae3dd8f154238fbad589420e68af025c3f6ac9f3d32944a39fcf4718acdace4" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.056646 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-pmdd7" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.106053 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bfcaccd-2faf-4d76-97a2-e6f47fea26b2" path="/var/lib/kubelet/pods/0bfcaccd-2faf-4d76-97a2-e6f47fea26b2/volumes" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.122076 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-fs8dc"] Nov 25 09:18:47 crc kubenswrapper[4565]: E1125 09:18:47.124154 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="405219a8-725f-4996-9efa-02837290d5e8" containerName="glance-db-sync" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.124290 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="405219a8-725f-4996-9efa-02837290d5e8" containerName="glance-db-sync" Nov 25 09:18:47 crc kubenswrapper[4565]: E1125 09:18:47.124381 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bfcaccd-2faf-4d76-97a2-e6f47fea26b2" containerName="ovn-config" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.124433 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bfcaccd-2faf-4d76-97a2-e6f47fea26b2" containerName="ovn-config" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.124634 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="405219a8-725f-4996-9efa-02837290d5e8" containerName="glance-db-sync" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.124698 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bfcaccd-2faf-4d76-97a2-e6f47fea26b2" containerName="ovn-config" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.125294 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-fs8dc" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.131002 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-67p5z" podUID="92bcfb29-921b-482a-97be-e651dc2d0ff0" containerName="registry-server" probeResult="failure" output=< Nov 25 09:18:47 crc kubenswrapper[4565]: timeout: failed to connect service ":50051" within 1s Nov 25 09:18:47 crc kubenswrapper[4565]: > Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.143802 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-fs8dc"] Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.188945 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9jrd\" (UniqueName: \"kubernetes.io/projected/d26517de-faae-4251-84d1-e9626a575d10-kube-api-access-f9jrd\") pod \"barbican-db-create-fs8dc\" (UID: \"d26517de-faae-4251-84d1-e9626a575d10\") " pod="openstack/barbican-db-create-fs8dc" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.189008 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d26517de-faae-4251-84d1-e9626a575d10-operator-scripts\") pod \"barbican-db-create-fs8dc\" (UID: \"d26517de-faae-4251-84d1-e9626a575d10\") " pod="openstack/barbican-db-create-fs8dc" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.212121 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-c2zj4"] Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.213275 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-c2zj4" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.230475 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-c2zj4"] Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.291536 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d26517de-faae-4251-84d1-e9626a575d10-operator-scripts\") pod \"barbican-db-create-fs8dc\" (UID: \"d26517de-faae-4251-84d1-e9626a575d10\") " pod="openstack/barbican-db-create-fs8dc" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.291630 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a355a7eb-1738-4c1b-975b-767821b77d5a-operator-scripts\") pod \"cinder-db-create-c2zj4\" (UID: \"a355a7eb-1738-4c1b-975b-767821b77d5a\") " pod="openstack/cinder-db-create-c2zj4" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.291761 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65kt7\" (UniqueName: \"kubernetes.io/projected/a355a7eb-1738-4c1b-975b-767821b77d5a-kube-api-access-65kt7\") pod \"cinder-db-create-c2zj4\" (UID: \"a355a7eb-1738-4c1b-975b-767821b77d5a\") " pod="openstack/cinder-db-create-c2zj4" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.291872 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9jrd\" (UniqueName: \"kubernetes.io/projected/d26517de-faae-4251-84d1-e9626a575d10-kube-api-access-f9jrd\") pod \"barbican-db-create-fs8dc\" (UID: \"d26517de-faae-4251-84d1-e9626a575d10\") " pod="openstack/barbican-db-create-fs8dc" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.292562 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d26517de-faae-4251-84d1-e9626a575d10-operator-scripts\") pod \"barbican-db-create-fs8dc\" (UID: \"d26517de-faae-4251-84d1-e9626a575d10\") " pod="openstack/barbican-db-create-fs8dc" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.320535 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-19b0-account-create-g2zxc"] Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.322102 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-19b0-account-create-g2zxc" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.323862 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9jrd\" (UniqueName: \"kubernetes.io/projected/d26517de-faae-4251-84d1-e9626a575d10-kube-api-access-f9jrd\") pod \"barbican-db-create-fs8dc\" (UID: \"d26517de-faae-4251-84d1-e9626a575d10\") " pod="openstack/barbican-db-create-fs8dc" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.336210 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.348117 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-19b0-account-create-g2zxc"] Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.393242 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwpqq\" (UniqueName: \"kubernetes.io/projected/dc2d8baf-2d1d-463c-b340-04a7a920de12-kube-api-access-rwpqq\") pod \"barbican-19b0-account-create-g2zxc\" (UID: \"dc2d8baf-2d1d-463c-b340-04a7a920de12\") " pod="openstack/barbican-19b0-account-create-g2zxc" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.393323 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a355a7eb-1738-4c1b-975b-767821b77d5a-operator-scripts\") pod 
\"cinder-db-create-c2zj4\" (UID: \"a355a7eb-1738-4c1b-975b-767821b77d5a\") " pod="openstack/cinder-db-create-c2zj4" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.393373 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc2d8baf-2d1d-463c-b340-04a7a920de12-operator-scripts\") pod \"barbican-19b0-account-create-g2zxc\" (UID: \"dc2d8baf-2d1d-463c-b340-04a7a920de12\") " pod="openstack/barbican-19b0-account-create-g2zxc" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.393436 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65kt7\" (UniqueName: \"kubernetes.io/projected/a355a7eb-1738-4c1b-975b-767821b77d5a-kube-api-access-65kt7\") pod \"cinder-db-create-c2zj4\" (UID: \"a355a7eb-1738-4c1b-975b-767821b77d5a\") " pod="openstack/cinder-db-create-c2zj4" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.394094 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a355a7eb-1738-4c1b-975b-767821b77d5a-operator-scripts\") pod \"cinder-db-create-c2zj4\" (UID: \"a355a7eb-1738-4c1b-975b-767821b77d5a\") " pod="openstack/cinder-db-create-c2zj4" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.430829 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65kt7\" (UniqueName: \"kubernetes.io/projected/a355a7eb-1738-4c1b-975b-767821b77d5a-kube-api-access-65kt7\") pod \"cinder-db-create-c2zj4\" (UID: \"a355a7eb-1738-4c1b-975b-767821b77d5a\") " pod="openstack/cinder-db-create-c2zj4" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.444305 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-fs8dc" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.489008 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-0b9e-account-create-ztz2x"] Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.490067 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0b9e-account-create-ztz2x" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.494952 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwpqq\" (UniqueName: \"kubernetes.io/projected/dc2d8baf-2d1d-463c-b340-04a7a920de12-kube-api-access-rwpqq\") pod \"barbican-19b0-account-create-g2zxc\" (UID: \"dc2d8baf-2d1d-463c-b340-04a7a920de12\") " pod="openstack/barbican-19b0-account-create-g2zxc" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.495142 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc2d8baf-2d1d-463c-b340-04a7a920de12-operator-scripts\") pod \"barbican-19b0-account-create-g2zxc\" (UID: \"dc2d8baf-2d1d-463c-b340-04a7a920de12\") " pod="openstack/barbican-19b0-account-create-g2zxc" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.495431 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.495848 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc2d8baf-2d1d-463c-b340-04a7a920de12-operator-scripts\") pod \"barbican-19b0-account-create-g2zxc\" (UID: \"dc2d8baf-2d1d-463c-b340-04a7a920de12\") " pod="openstack/barbican-19b0-account-create-g2zxc" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.509615 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-0b9e-account-create-ztz2x"] Nov 25 09:18:47 crc 
kubenswrapper[4565]: I1125 09:18:47.527845 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-c2zj4" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.535776 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-4cj9f"] Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.536866 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4cj9f" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.552288 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwpqq\" (UniqueName: \"kubernetes.io/projected/dc2d8baf-2d1d-463c-b340-04a7a920de12-kube-api-access-rwpqq\") pod \"barbican-19b0-account-create-g2zxc\" (UID: \"dc2d8baf-2d1d-463c-b340-04a7a920de12\") " pod="openstack/barbican-19b0-account-create-g2zxc" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.569104 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4cj9f"] Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.597048 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xchdk\" (UniqueName: \"kubernetes.io/projected/0f29e809-94fa-4ff1-85d9-b1a6786a1763-kube-api-access-xchdk\") pod \"neutron-db-create-4cj9f\" (UID: \"0f29e809-94fa-4ff1-85d9-b1a6786a1763\") " pod="openstack/neutron-db-create-4cj9f" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.597107 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-696bc\" (UniqueName: \"kubernetes.io/projected/2005d542-3de1-4147-ad79-56126eccafdd-kube-api-access-696bc\") pod \"cinder-0b9e-account-create-ztz2x\" (UID: \"2005d542-3de1-4147-ad79-56126eccafdd\") " pod="openstack/cinder-0b9e-account-create-ztz2x" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.597131 4565 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2005d542-3de1-4147-ad79-56126eccafdd-operator-scripts\") pod \"cinder-0b9e-account-create-ztz2x\" (UID: \"2005d542-3de1-4147-ad79-56126eccafdd\") " pod="openstack/cinder-0b9e-account-create-ztz2x" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.597195 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f29e809-94fa-4ff1-85d9-b1a6786a1763-operator-scripts\") pod \"neutron-db-create-4cj9f\" (UID: \"0f29e809-94fa-4ff1-85d9-b1a6786a1763\") " pod="openstack/neutron-db-create-4cj9f" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.641918 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-19b0-account-create-g2zxc" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.650049 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5dbb-account-create-gq67j"] Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.651380 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5dbb-account-create-gq67j" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.664363 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.673951 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-569d458467-6r4gl"] Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.675304 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-569d458467-6r4gl" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.688766 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5dbb-account-create-gq67j"] Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.699257 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66545593-8b73-4d2e-9f66-4eb259ef6752-ovsdbserver-nb\") pod \"dnsmasq-dns-569d458467-6r4gl\" (UID: \"66545593-8b73-4d2e-9f66-4eb259ef6752\") " pod="openstack/dnsmasq-dns-569d458467-6r4gl" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.699296 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66545593-8b73-4d2e-9f66-4eb259ef6752-ovsdbserver-sb\") pod \"dnsmasq-dns-569d458467-6r4gl\" (UID: \"66545593-8b73-4d2e-9f66-4eb259ef6752\") " pod="openstack/dnsmasq-dns-569d458467-6r4gl" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.699360 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttwlj\" (UniqueName: \"kubernetes.io/projected/66545593-8b73-4d2e-9f66-4eb259ef6752-kube-api-access-ttwlj\") pod \"dnsmasq-dns-569d458467-6r4gl\" (UID: \"66545593-8b73-4d2e-9f66-4eb259ef6752\") " pod="openstack/dnsmasq-dns-569d458467-6r4gl" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.699394 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xchdk\" (UniqueName: \"kubernetes.io/projected/0f29e809-94fa-4ff1-85d9-b1a6786a1763-kube-api-access-xchdk\") pod \"neutron-db-create-4cj9f\" (UID: \"0f29e809-94fa-4ff1-85d9-b1a6786a1763\") " pod="openstack/neutron-db-create-4cj9f" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.699442 4565 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-696bc\" (UniqueName: \"kubernetes.io/projected/2005d542-3de1-4147-ad79-56126eccafdd-kube-api-access-696bc\") pod \"cinder-0b9e-account-create-ztz2x\" (UID: \"2005d542-3de1-4147-ad79-56126eccafdd\") " pod="openstack/cinder-0b9e-account-create-ztz2x" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.699461 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2005d542-3de1-4147-ad79-56126eccafdd-operator-scripts\") pod \"cinder-0b9e-account-create-ztz2x\" (UID: \"2005d542-3de1-4147-ad79-56126eccafdd\") " pod="openstack/cinder-0b9e-account-create-ztz2x" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.699491 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66545593-8b73-4d2e-9f66-4eb259ef6752-config\") pod \"dnsmasq-dns-569d458467-6r4gl\" (UID: \"66545593-8b73-4d2e-9f66-4eb259ef6752\") " pod="openstack/dnsmasq-dns-569d458467-6r4gl" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.699513 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66545593-8b73-4d2e-9f66-4eb259ef6752-dns-svc\") pod \"dnsmasq-dns-569d458467-6r4gl\" (UID: \"66545593-8b73-4d2e-9f66-4eb259ef6752\") " pod="openstack/dnsmasq-dns-569d458467-6r4gl" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.699568 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f29e809-94fa-4ff1-85d9-b1a6786a1763-operator-scripts\") pod \"neutron-db-create-4cj9f\" (UID: \"0f29e809-94fa-4ff1-85d9-b1a6786a1763\") " pod="openstack/neutron-db-create-4cj9f" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.699597 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-2xlp8\" (UniqueName: \"kubernetes.io/projected/6b86a550-d12f-4a27-abae-e115391cbb13-kube-api-access-2xlp8\") pod \"neutron-5dbb-account-create-gq67j\" (UID: \"6b86a550-d12f-4a27-abae-e115391cbb13\") " pod="openstack/neutron-5dbb-account-create-gq67j" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.699675 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b86a550-d12f-4a27-abae-e115391cbb13-operator-scripts\") pod \"neutron-5dbb-account-create-gq67j\" (UID: \"6b86a550-d12f-4a27-abae-e115391cbb13\") " pod="openstack/neutron-5dbb-account-create-gq67j" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.700608 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f29e809-94fa-4ff1-85d9-b1a6786a1763-operator-scripts\") pod \"neutron-db-create-4cj9f\" (UID: \"0f29e809-94fa-4ff1-85d9-b1a6786a1763\") " pod="openstack/neutron-db-create-4cj9f" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.700841 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2005d542-3de1-4147-ad79-56126eccafdd-operator-scripts\") pod \"cinder-0b9e-account-create-ztz2x\" (UID: \"2005d542-3de1-4147-ad79-56126eccafdd\") " pod="openstack/cinder-0b9e-account-create-ztz2x" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.701613 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-569d458467-6r4gl"] Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.731209 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xchdk\" (UniqueName: \"kubernetes.io/projected/0f29e809-94fa-4ff1-85d9-b1a6786a1763-kube-api-access-xchdk\") pod \"neutron-db-create-4cj9f\" (UID: \"0f29e809-94fa-4ff1-85d9-b1a6786a1763\") " 
pod="openstack/neutron-db-create-4cj9f" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.774376 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-bssz7"] Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.784435 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-696bc\" (UniqueName: \"kubernetes.io/projected/2005d542-3de1-4147-ad79-56126eccafdd-kube-api-access-696bc\") pod \"cinder-0b9e-account-create-ztz2x\" (UID: \"2005d542-3de1-4147-ad79-56126eccafdd\") " pod="openstack/cinder-0b9e-account-create-ztz2x" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.810919 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66545593-8b73-4d2e-9f66-4eb259ef6752-config\") pod \"dnsmasq-dns-569d458467-6r4gl\" (UID: \"66545593-8b73-4d2e-9f66-4eb259ef6752\") " pod="openstack/dnsmasq-dns-569d458467-6r4gl" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.852796 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66545593-8b73-4d2e-9f66-4eb259ef6752-dns-svc\") pod \"dnsmasq-dns-569d458467-6r4gl\" (UID: \"66545593-8b73-4d2e-9f66-4eb259ef6752\") " pod="openstack/dnsmasq-dns-569d458467-6r4gl" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.853106 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xlp8\" (UniqueName: \"kubernetes.io/projected/6b86a550-d12f-4a27-abae-e115391cbb13-kube-api-access-2xlp8\") pod \"neutron-5dbb-account-create-gq67j\" (UID: \"6b86a550-d12f-4a27-abae-e115391cbb13\") " pod="openstack/neutron-5dbb-account-create-gq67j" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.853256 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6b86a550-d12f-4a27-abae-e115391cbb13-operator-scripts\") pod \"neutron-5dbb-account-create-gq67j\" (UID: \"6b86a550-d12f-4a27-abae-e115391cbb13\") " pod="openstack/neutron-5dbb-account-create-gq67j" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.853496 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66545593-8b73-4d2e-9f66-4eb259ef6752-ovsdbserver-nb\") pod \"dnsmasq-dns-569d458467-6r4gl\" (UID: \"66545593-8b73-4d2e-9f66-4eb259ef6752\") " pod="openstack/dnsmasq-dns-569d458467-6r4gl" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.853559 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66545593-8b73-4d2e-9f66-4eb259ef6752-ovsdbserver-sb\") pod \"dnsmasq-dns-569d458467-6r4gl\" (UID: \"66545593-8b73-4d2e-9f66-4eb259ef6752\") " pod="openstack/dnsmasq-dns-569d458467-6r4gl" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.853734 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttwlj\" (UniqueName: \"kubernetes.io/projected/66545593-8b73-4d2e-9f66-4eb259ef6752-kube-api-access-ttwlj\") pod \"dnsmasq-dns-569d458467-6r4gl\" (UID: \"66545593-8b73-4d2e-9f66-4eb259ef6752\") " pod="openstack/dnsmasq-dns-569d458467-6r4gl" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.811719 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66545593-8b73-4d2e-9f66-4eb259ef6752-config\") pod \"dnsmasq-dns-569d458467-6r4gl\" (UID: \"66545593-8b73-4d2e-9f66-4eb259ef6752\") " pod="openstack/dnsmasq-dns-569d458467-6r4gl" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.855978 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6b86a550-d12f-4a27-abae-e115391cbb13-operator-scripts\") pod \"neutron-5dbb-account-create-gq67j\" (UID: \"6b86a550-d12f-4a27-abae-e115391cbb13\") " pod="openstack/neutron-5dbb-account-create-gq67j" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.857053 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66545593-8b73-4d2e-9f66-4eb259ef6752-ovsdbserver-nb\") pod \"dnsmasq-dns-569d458467-6r4gl\" (UID: \"66545593-8b73-4d2e-9f66-4eb259ef6752\") " pod="openstack/dnsmasq-dns-569d458467-6r4gl" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.857784 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66545593-8b73-4d2e-9f66-4eb259ef6752-ovsdbserver-sb\") pod \"dnsmasq-dns-569d458467-6r4gl\" (UID: \"66545593-8b73-4d2e-9f66-4eb259ef6752\") " pod="openstack/dnsmasq-dns-569d458467-6r4gl" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.863007 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66545593-8b73-4d2e-9f66-4eb259ef6752-dns-svc\") pod \"dnsmasq-dns-569d458467-6r4gl\" (UID: \"66545593-8b73-4d2e-9f66-4eb259ef6752\") " pod="openstack/dnsmasq-dns-569d458467-6r4gl" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.863635 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-bssz7"] Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.863710 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bssz7" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.865231 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-0b9e-account-create-ztz2x" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.871332 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tx2k9" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.879275 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.879993 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.883422 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttwlj\" (UniqueName: \"kubernetes.io/projected/66545593-8b73-4d2e-9f66-4eb259ef6752-kube-api-access-ttwlj\") pod \"dnsmasq-dns-569d458467-6r4gl\" (UID: \"66545593-8b73-4d2e-9f66-4eb259ef6752\") " pod="openstack/dnsmasq-dns-569d458467-6r4gl" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.890437 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-4cj9f" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.900439 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.906453 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xlp8\" (UniqueName: \"kubernetes.io/projected/6b86a550-d12f-4a27-abae-e115391cbb13-kube-api-access-2xlp8\") pod \"neutron-5dbb-account-create-gq67j\" (UID: \"6b86a550-d12f-4a27-abae-e115391cbb13\") " pod="openstack/neutron-5dbb-account-create-gq67j" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.956051 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkdtv\" (UniqueName: \"kubernetes.io/projected/2df05272-b4de-41ac-8d18-16c29398c0d4-kube-api-access-zkdtv\") pod \"keystone-db-sync-bssz7\" (UID: \"2df05272-b4de-41ac-8d18-16c29398c0d4\") " pod="openstack/keystone-db-sync-bssz7" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.956324 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2df05272-b4de-41ac-8d18-16c29398c0d4-config-data\") pod \"keystone-db-sync-bssz7\" (UID: \"2df05272-b4de-41ac-8d18-16c29398c0d4\") " pod="openstack/keystone-db-sync-bssz7" Nov 25 09:18:47 crc kubenswrapper[4565]: I1125 09:18:47.956442 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2df05272-b4de-41ac-8d18-16c29398c0d4-combined-ca-bundle\") pod \"keystone-db-sync-bssz7\" (UID: \"2df05272-b4de-41ac-8d18-16c29398c0d4\") " pod="openstack/keystone-db-sync-bssz7" Nov 25 09:18:48 crc kubenswrapper[4565]: I1125 09:18:48.018993 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5dbb-account-create-gq67j" Nov 25 09:18:48 crc kubenswrapper[4565]: I1125 09:18:48.044238 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-569d458467-6r4gl" Nov 25 09:18:48 crc kubenswrapper[4565]: I1125 09:18:48.059593 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2df05272-b4de-41ac-8d18-16c29398c0d4-combined-ca-bundle\") pod \"keystone-db-sync-bssz7\" (UID: \"2df05272-b4de-41ac-8d18-16c29398c0d4\") " pod="openstack/keystone-db-sync-bssz7" Nov 25 09:18:48 crc kubenswrapper[4565]: I1125 09:18:48.059702 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkdtv\" (UniqueName: \"kubernetes.io/projected/2df05272-b4de-41ac-8d18-16c29398c0d4-kube-api-access-zkdtv\") pod \"keystone-db-sync-bssz7\" (UID: \"2df05272-b4de-41ac-8d18-16c29398c0d4\") " pod="openstack/keystone-db-sync-bssz7" Nov 25 09:18:48 crc kubenswrapper[4565]: I1125 09:18:48.059773 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2df05272-b4de-41ac-8d18-16c29398c0d4-config-data\") pod \"keystone-db-sync-bssz7\" (UID: \"2df05272-b4de-41ac-8d18-16c29398c0d4\") " pod="openstack/keystone-db-sync-bssz7" Nov 25 09:18:48 crc kubenswrapper[4565]: I1125 09:18:48.075351 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2df05272-b4de-41ac-8d18-16c29398c0d4-config-data\") pod \"keystone-db-sync-bssz7\" (UID: \"2df05272-b4de-41ac-8d18-16c29398c0d4\") " pod="openstack/keystone-db-sync-bssz7" Nov 25 09:18:48 crc kubenswrapper[4565]: I1125 09:18:48.075891 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2df05272-b4de-41ac-8d18-16c29398c0d4-combined-ca-bundle\") pod \"keystone-db-sync-bssz7\" (UID: \"2df05272-b4de-41ac-8d18-16c29398c0d4\") " pod="openstack/keystone-db-sync-bssz7" Nov 25 09:18:48 crc kubenswrapper[4565]: I1125 09:18:48.096067 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkdtv\" (UniqueName: \"kubernetes.io/projected/2df05272-b4de-41ac-8d18-16c29398c0d4-kube-api-access-zkdtv\") pod \"keystone-db-sync-bssz7\" (UID: \"2df05272-b4de-41ac-8d18-16c29398c0d4\") " pod="openstack/keystone-db-sync-bssz7" Nov 25 09:18:48 crc kubenswrapper[4565]: I1125 09:18:48.247592 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bssz7" Nov 25 09:18:48 crc kubenswrapper[4565]: I1125 09:18:48.406830 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-fs8dc"] Nov 25 09:18:48 crc kubenswrapper[4565]: I1125 09:18:48.509831 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4cj9f"] Nov 25 09:18:48 crc kubenswrapper[4565]: W1125 09:18:48.539280 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f29e809_94fa_4ff1_85d9_b1a6786a1763.slice/crio-f6142e4e2b89b0480c3d43e6554d35029e36782d0943a490c7bc8438c7ef2844 WatchSource:0}: Error finding container f6142e4e2b89b0480c3d43e6554d35029e36782d0943a490c7bc8438c7ef2844: Status 404 returned error can't find the container with id f6142e4e2b89b0480c3d43e6554d35029e36782d0943a490c7bc8438c7ef2844 Nov 25 09:18:48 crc kubenswrapper[4565]: I1125 09:18:48.698193 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="80ebb406-0240-4ba6-86f1-177776f19865" containerName="galera" probeResult="failure" output="command timed out" Nov 25 09:18:48 crc kubenswrapper[4565]: I1125 09:18:48.817960 4565 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/barbican-19b0-account-create-g2zxc"] Nov 25 09:18:48 crc kubenswrapper[4565]: W1125 09:18:48.836914 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc2d8baf_2d1d_463c_b340_04a7a920de12.slice/crio-bf05b04dd177d5dd91781e8ef8474827a433d459e2d356306bf864d4206299ac WatchSource:0}: Error finding container bf05b04dd177d5dd91781e8ef8474827a433d459e2d356306bf864d4206299ac: Status 404 returned error can't find the container with id bf05b04dd177d5dd91781e8ef8474827a433d459e2d356306bf864d4206299ac Nov 25 09:18:48 crc kubenswrapper[4565]: I1125 09:18:48.875229 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-c2zj4"] Nov 25 09:18:48 crc kubenswrapper[4565]: I1125 09:18:48.895715 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5dbb-account-create-gq67j"] Nov 25 09:18:49 crc kubenswrapper[4565]: I1125 09:18:49.083137 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4cj9f" event={"ID":"0f29e809-94fa-4ff1-85d9-b1a6786a1763","Type":"ContainerStarted","Data":"f6142e4e2b89b0480c3d43e6554d35029e36782d0943a490c7bc8438c7ef2844"} Nov 25 09:18:49 crc kubenswrapper[4565]: I1125 09:18:49.085457 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fs8dc" event={"ID":"d26517de-faae-4251-84d1-e9626a575d10","Type":"ContainerStarted","Data":"eb6ba4c75a0863ee7fd8b96616ca419cdc3a4466fb534de1c91e14cfedff55d5"} Nov 25 09:18:49 crc kubenswrapper[4565]: I1125 09:18:49.086890 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-19b0-account-create-g2zxc" event={"ID":"dc2d8baf-2d1d-463c-b340-04a7a920de12","Type":"ContainerStarted","Data":"bf05b04dd177d5dd91781e8ef8474827a433d459e2d356306bf864d4206299ac"} Nov 25 09:18:49 crc kubenswrapper[4565]: I1125 09:18:49.176593 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-0b9e-account-create-ztz2x"] Nov 25 09:18:49 crc kubenswrapper[4565]: I1125 09:18:49.192139 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-bssz7"] Nov 25 09:18:49 crc kubenswrapper[4565]: I1125 09:18:49.207960 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-569d458467-6r4gl"] Nov 25 09:18:50 crc kubenswrapper[4565]: I1125 09:18:50.093766 4565 generic.go:334] "Generic (PLEG): container finished" podID="66545593-8b73-4d2e-9f66-4eb259ef6752" containerID="e5876a624ab2421d5cff515ae1a4cc21046d44088906f59cb0968ded0e4bce59" exitCode=0 Nov 25 09:18:50 crc kubenswrapper[4565]: I1125 09:18:50.093943 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-569d458467-6r4gl" event={"ID":"66545593-8b73-4d2e-9f66-4eb259ef6752","Type":"ContainerDied","Data":"e5876a624ab2421d5cff515ae1a4cc21046d44088906f59cb0968ded0e4bce59"} Nov 25 09:18:50 crc kubenswrapper[4565]: I1125 09:18:50.094180 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-569d458467-6r4gl" event={"ID":"66545593-8b73-4d2e-9f66-4eb259ef6752","Type":"ContainerStarted","Data":"f99c2c3c0e0c86a2019eb76bc6c3963f2406c72e8cee221275969f509f8d4086"} Nov 25 09:18:50 crc kubenswrapper[4565]: I1125 09:18:50.102954 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0b9e-account-create-ztz2x" event={"ID":"2005d542-3de1-4147-ad79-56126eccafdd","Type":"ContainerStarted","Data":"a2c03d5a24bdffb51829758b74b615e235cc84bff56fa4b9659cd4ef32d4b727"} Nov 25 09:18:50 crc kubenswrapper[4565]: I1125 09:18:50.103059 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0b9e-account-create-ztz2x" event={"ID":"2005d542-3de1-4147-ad79-56126eccafdd","Type":"ContainerStarted","Data":"83114b34b891c8866d4fc7207e6aca5270a31eb8c4d4e129a9899dd65ebc14cf"} Nov 25 09:18:50 crc kubenswrapper[4565]: I1125 09:18:50.109301 4565 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-db-create-c2zj4" event={"ID":"a355a7eb-1738-4c1b-975b-767821b77d5a","Type":"ContainerStarted","Data":"a5be5ca82d369b44e99466facfc5e44236be158a00d4f8b0e1ff5bb38019e1b3"} Nov 25 09:18:50 crc kubenswrapper[4565]: I1125 09:18:50.109408 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-c2zj4" event={"ID":"a355a7eb-1738-4c1b-975b-767821b77d5a","Type":"ContainerStarted","Data":"afeef16b663362b64115cd07f751d8a762a409cca2e294ac365bc11d44f06ee7"} Nov 25 09:18:50 crc kubenswrapper[4565]: I1125 09:18:50.111999 4565 generic.go:334] "Generic (PLEG): container finished" podID="0f29e809-94fa-4ff1-85d9-b1a6786a1763" containerID="cb0fc42d94f3da9546b989d13eb383ca0590743c2fa36cc77ab49f64f5d6a8ad" exitCode=0 Nov 25 09:18:50 crc kubenswrapper[4565]: I1125 09:18:50.112089 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4cj9f" event={"ID":"0f29e809-94fa-4ff1-85d9-b1a6786a1763","Type":"ContainerDied","Data":"cb0fc42d94f3da9546b989d13eb383ca0590743c2fa36cc77ab49f64f5d6a8ad"} Nov 25 09:18:50 crc kubenswrapper[4565]: I1125 09:18:50.117743 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dbb-account-create-gq67j" event={"ID":"6b86a550-d12f-4a27-abae-e115391cbb13","Type":"ContainerStarted","Data":"a8e2f4be35e77c4696ff5c86e735ce6fa04a8a3578040e82843b64261146dff7"} Nov 25 09:18:50 crc kubenswrapper[4565]: I1125 09:18:50.117792 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dbb-account-create-gq67j" event={"ID":"6b86a550-d12f-4a27-abae-e115391cbb13","Type":"ContainerStarted","Data":"06511dd9fe59f8bb150d9db1ea5a2104d4e64a05eeaf622c7a96b0d85372ca30"} Nov 25 09:18:50 crc kubenswrapper[4565]: I1125 09:18:50.120650 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bssz7" 
event={"ID":"2df05272-b4de-41ac-8d18-16c29398c0d4","Type":"ContainerStarted","Data":"043ff5fa2fad2dd9488b7270b3c870909049b4c186e19165361fdda24c85b5ee"} Nov 25 09:18:50 crc kubenswrapper[4565]: I1125 09:18:50.122272 4565 generic.go:334] "Generic (PLEG): container finished" podID="d26517de-faae-4251-84d1-e9626a575d10" containerID="4439f437b0343e7149841b89b99d546734bf695a5c31c28f5e85896dfcb24c89" exitCode=0 Nov 25 09:18:50 crc kubenswrapper[4565]: I1125 09:18:50.122327 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fs8dc" event={"ID":"d26517de-faae-4251-84d1-e9626a575d10","Type":"ContainerDied","Data":"4439f437b0343e7149841b89b99d546734bf695a5c31c28f5e85896dfcb24c89"} Nov 25 09:18:50 crc kubenswrapper[4565]: I1125 09:18:50.127010 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-19b0-account-create-g2zxc" event={"ID":"dc2d8baf-2d1d-463c-b340-04a7a920de12","Type":"ContainerStarted","Data":"f31b0601f66bb2a9d97783adb1fbfcc52a5ea8dff246d30bf3c1496084e3134e"} Nov 25 09:18:50 crc kubenswrapper[4565]: I1125 09:18:50.127356 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-0b9e-account-create-ztz2x" podStartSLOduration=3.127340424 podStartE2EDuration="3.127340424s" podCreationTimestamp="2025-11-25 09:18:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:18:50.126109934 +0000 UTC m=+863.328605071" watchObservedRunningTime="2025-11-25 09:18:50.127340424 +0000 UTC m=+863.329835562" Nov 25 09:18:50 crc kubenswrapper[4565]: I1125 09:18:50.158656 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-c2zj4" podStartSLOduration=3.15864098 podStartE2EDuration="3.15864098s" podCreationTimestamp="2025-11-25 09:18:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-11-25 09:18:50.140205806 +0000 UTC m=+863.342700945" watchObservedRunningTime="2025-11-25 09:18:50.15864098 +0000 UTC m=+863.361136118" Nov 25 09:18:50 crc kubenswrapper[4565]: I1125 09:18:50.183227 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-19b0-account-create-g2zxc" podStartSLOduration=3.1832118 podStartE2EDuration="3.1832118s" podCreationTimestamp="2025-11-25 09:18:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:18:50.151312014 +0000 UTC m=+863.353807152" watchObservedRunningTime="2025-11-25 09:18:50.1832118 +0000 UTC m=+863.385706938" Nov 25 09:18:50 crc kubenswrapper[4565]: I1125 09:18:50.185739 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5dbb-account-create-gq67j" podStartSLOduration=3.185732673 podStartE2EDuration="3.185732673s" podCreationTimestamp="2025-11-25 09:18:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:18:50.162077118 +0000 UTC m=+863.364572266" watchObservedRunningTime="2025-11-25 09:18:50.185732673 +0000 UTC m=+863.388227811" Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.136003 4565 generic.go:334] "Generic (PLEG): container finished" podID="dc2d8baf-2d1d-463c-b340-04a7a920de12" containerID="f31b0601f66bb2a9d97783adb1fbfcc52a5ea8dff246d30bf3c1496084e3134e" exitCode=0 Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.136107 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-19b0-account-create-g2zxc" event={"ID":"dc2d8baf-2d1d-463c-b340-04a7a920de12","Type":"ContainerDied","Data":"f31b0601f66bb2a9d97783adb1fbfcc52a5ea8dff246d30bf3c1496084e3134e"} Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.138671 4565 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-569d458467-6r4gl" event={"ID":"66545593-8b73-4d2e-9f66-4eb259ef6752","Type":"ContainerStarted","Data":"88a2e475737f9205d7eddd768e245c3341c0cfe05c417ab65bb93947aa664bcb"} Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.138819 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-569d458467-6r4gl" Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.140234 4565 generic.go:334] "Generic (PLEG): container finished" podID="2005d542-3de1-4147-ad79-56126eccafdd" containerID="a2c03d5a24bdffb51829758b74b615e235cc84bff56fa4b9659cd4ef32d4b727" exitCode=0 Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.140344 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0b9e-account-create-ztz2x" event={"ID":"2005d542-3de1-4147-ad79-56126eccafdd","Type":"ContainerDied","Data":"a2c03d5a24bdffb51829758b74b615e235cc84bff56fa4b9659cd4ef32d4b727"} Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.141662 4565 generic.go:334] "Generic (PLEG): container finished" podID="a355a7eb-1738-4c1b-975b-767821b77d5a" containerID="a5be5ca82d369b44e99466facfc5e44236be158a00d4f8b0e1ff5bb38019e1b3" exitCode=0 Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.141715 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-c2zj4" event={"ID":"a355a7eb-1738-4c1b-975b-767821b77d5a","Type":"ContainerDied","Data":"a5be5ca82d369b44e99466facfc5e44236be158a00d4f8b0e1ff5bb38019e1b3"} Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.146010 4565 generic.go:334] "Generic (PLEG): container finished" podID="6b86a550-d12f-4a27-abae-e115391cbb13" containerID="a8e2f4be35e77c4696ff5c86e735ce6fa04a8a3578040e82843b64261146dff7" exitCode=0 Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.146258 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dbb-account-create-gq67j" 
event={"ID":"6b86a550-d12f-4a27-abae-e115391cbb13","Type":"ContainerDied","Data":"a8e2f4be35e77c4696ff5c86e735ce6fa04a8a3578040e82843b64261146dff7"} Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.200063 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-569d458467-6r4gl" podStartSLOduration=4.200042859 podStartE2EDuration="4.200042859s" podCreationTimestamp="2025-11-25 09:18:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:18:51.186042105 +0000 UTC m=+864.388537233" watchObservedRunningTime="2025-11-25 09:18:51.200042859 +0000 UTC m=+864.402537987" Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.599212 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4cj9f" Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.615403 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-fs8dc" Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.663852 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d26517de-faae-4251-84d1-e9626a575d10-operator-scripts\") pod \"d26517de-faae-4251-84d1-e9626a575d10\" (UID: \"d26517de-faae-4251-84d1-e9626a575d10\") " Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.663968 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9jrd\" (UniqueName: \"kubernetes.io/projected/d26517de-faae-4251-84d1-e9626a575d10-kube-api-access-f9jrd\") pod \"d26517de-faae-4251-84d1-e9626a575d10\" (UID: \"d26517de-faae-4251-84d1-e9626a575d10\") " Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.664413 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xchdk\" (UniqueName: \"kubernetes.io/projected/0f29e809-94fa-4ff1-85d9-b1a6786a1763-kube-api-access-xchdk\") pod \"0f29e809-94fa-4ff1-85d9-b1a6786a1763\" (UID: \"0f29e809-94fa-4ff1-85d9-b1a6786a1763\") " Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.664518 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f29e809-94fa-4ff1-85d9-b1a6786a1763-operator-scripts\") pod \"0f29e809-94fa-4ff1-85d9-b1a6786a1763\" (UID: \"0f29e809-94fa-4ff1-85d9-b1a6786a1763\") " Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.664641 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d26517de-faae-4251-84d1-e9626a575d10-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d26517de-faae-4251-84d1-e9626a575d10" (UID: "d26517de-faae-4251-84d1-e9626a575d10"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.665107 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f29e809-94fa-4ff1-85d9-b1a6786a1763-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0f29e809-94fa-4ff1-85d9-b1a6786a1763" (UID: "0f29e809-94fa-4ff1-85d9-b1a6786a1763"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.665299 4565 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f29e809-94fa-4ff1-85d9-b1a6786a1763-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.665325 4565 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d26517de-faae-4251-84d1-e9626a575d10-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.672491 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f29e809-94fa-4ff1-85d9-b1a6786a1763-kube-api-access-xchdk" (OuterVolumeSpecName: "kube-api-access-xchdk") pod "0f29e809-94fa-4ff1-85d9-b1a6786a1763" (UID: "0f29e809-94fa-4ff1-85d9-b1a6786a1763"). InnerVolumeSpecName "kube-api-access-xchdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.679358 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d26517de-faae-4251-84d1-e9626a575d10-kube-api-access-f9jrd" (OuterVolumeSpecName: "kube-api-access-f9jrd") pod "d26517de-faae-4251-84d1-e9626a575d10" (UID: "d26517de-faae-4251-84d1-e9626a575d10"). InnerVolumeSpecName "kube-api-access-f9jrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.767571 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9jrd\" (UniqueName: \"kubernetes.io/projected/d26517de-faae-4251-84d1-e9626a575d10-kube-api-access-f9jrd\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:51 crc kubenswrapper[4565]: I1125 09:18:51.767598 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xchdk\" (UniqueName: \"kubernetes.io/projected/0f29e809-94fa-4ff1-85d9-b1a6786a1763-kube-api-access-xchdk\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:52 crc kubenswrapper[4565]: I1125 09:18:52.163215 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4cj9f" Nov 25 09:18:52 crc kubenswrapper[4565]: I1125 09:18:52.163270 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4cj9f" event={"ID":"0f29e809-94fa-4ff1-85d9-b1a6786a1763","Type":"ContainerDied","Data":"f6142e4e2b89b0480c3d43e6554d35029e36782d0943a490c7bc8438c7ef2844"} Nov 25 09:18:52 crc kubenswrapper[4565]: I1125 09:18:52.163748 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6142e4e2b89b0480c3d43e6554d35029e36782d0943a490c7bc8438c7ef2844" Nov 25 09:18:52 crc kubenswrapper[4565]: I1125 09:18:52.170802 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-fs8dc" Nov 25 09:18:52 crc kubenswrapper[4565]: I1125 09:18:52.171048 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fs8dc" event={"ID":"d26517de-faae-4251-84d1-e9626a575d10","Type":"ContainerDied","Data":"eb6ba4c75a0863ee7fd8b96616ca419cdc3a4466fb534de1c91e14cfedff55d5"} Nov 25 09:18:52 crc kubenswrapper[4565]: I1125 09:18:52.171079 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb6ba4c75a0863ee7fd8b96616ca419cdc3a4466fb534de1c91e14cfedff55d5" Nov 25 09:18:52 crc kubenswrapper[4565]: I1125 09:18:52.577064 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-c2zj4" Nov 25 09:18:52 crc kubenswrapper[4565]: I1125 09:18:52.699691 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65kt7\" (UniqueName: \"kubernetes.io/projected/a355a7eb-1738-4c1b-975b-767821b77d5a-kube-api-access-65kt7\") pod \"a355a7eb-1738-4c1b-975b-767821b77d5a\" (UID: \"a355a7eb-1738-4c1b-975b-767821b77d5a\") " Nov 25 09:18:52 crc kubenswrapper[4565]: I1125 09:18:52.699800 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a355a7eb-1738-4c1b-975b-767821b77d5a-operator-scripts\") pod \"a355a7eb-1738-4c1b-975b-767821b77d5a\" (UID: \"a355a7eb-1738-4c1b-975b-767821b77d5a\") " Nov 25 09:18:52 crc kubenswrapper[4565]: I1125 09:18:52.700389 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a355a7eb-1738-4c1b-975b-767821b77d5a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a355a7eb-1738-4c1b-975b-767821b77d5a" (UID: "a355a7eb-1738-4c1b-975b-767821b77d5a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:52 crc kubenswrapper[4565]: I1125 09:18:52.700652 4565 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a355a7eb-1738-4c1b-975b-767821b77d5a-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:52 crc kubenswrapper[4565]: I1125 09:18:52.721422 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a355a7eb-1738-4c1b-975b-767821b77d5a-kube-api-access-65kt7" (OuterVolumeSpecName: "kube-api-access-65kt7") pod "a355a7eb-1738-4c1b-975b-767821b77d5a" (UID: "a355a7eb-1738-4c1b-975b-767821b77d5a"). InnerVolumeSpecName "kube-api-access-65kt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:18:52 crc kubenswrapper[4565]: I1125 09:18:52.801863 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65kt7\" (UniqueName: \"kubernetes.io/projected/a355a7eb-1738-4c1b-975b-767821b77d5a-kube-api-access-65kt7\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:53 crc kubenswrapper[4565]: I1125 09:18:53.195460 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-c2zj4" event={"ID":"a355a7eb-1738-4c1b-975b-767821b77d5a","Type":"ContainerDied","Data":"afeef16b663362b64115cd07f751d8a762a409cca2e294ac365bc11d44f06ee7"} Nov 25 09:18:53 crc kubenswrapper[4565]: I1125 09:18:53.195505 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afeef16b663362b64115cd07f751d8a762a409cca2e294ac365bc11d44f06ee7" Nov 25 09:18:53 crc kubenswrapper[4565]: I1125 09:18:53.195565 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-c2zj4" Nov 25 09:18:54 crc kubenswrapper[4565]: I1125 09:18:54.965903 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5dbb-account-create-gq67j" Nov 25 09:18:54 crc kubenswrapper[4565]: I1125 09:18:54.973116 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0b9e-account-create-ztz2x" Nov 25 09:18:54 crc kubenswrapper[4565]: I1125 09:18:54.994446 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-19b0-account-create-g2zxc" Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.038037 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc2d8baf-2d1d-463c-b340-04a7a920de12-operator-scripts\") pod \"dc2d8baf-2d1d-463c-b340-04a7a920de12\" (UID: \"dc2d8baf-2d1d-463c-b340-04a7a920de12\") " Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.038082 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2005d542-3de1-4147-ad79-56126eccafdd-operator-scripts\") pod \"2005d542-3de1-4147-ad79-56126eccafdd\" (UID: \"2005d542-3de1-4147-ad79-56126eccafdd\") " Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.038132 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b86a550-d12f-4a27-abae-e115391cbb13-operator-scripts\") pod \"6b86a550-d12f-4a27-abae-e115391cbb13\" (UID: \"6b86a550-d12f-4a27-abae-e115391cbb13\") " Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.038162 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwpqq\" (UniqueName: \"kubernetes.io/projected/dc2d8baf-2d1d-463c-b340-04a7a920de12-kube-api-access-rwpqq\") pod \"dc2d8baf-2d1d-463c-b340-04a7a920de12\" (UID: \"dc2d8baf-2d1d-463c-b340-04a7a920de12\") " Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.038217 4565 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-696bc\" (UniqueName: \"kubernetes.io/projected/2005d542-3de1-4147-ad79-56126eccafdd-kube-api-access-696bc\") pod \"2005d542-3de1-4147-ad79-56126eccafdd\" (UID: \"2005d542-3de1-4147-ad79-56126eccafdd\") " Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.038301 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xlp8\" (UniqueName: \"kubernetes.io/projected/6b86a550-d12f-4a27-abae-e115391cbb13-kube-api-access-2xlp8\") pod \"6b86a550-d12f-4a27-abae-e115391cbb13\" (UID: \"6b86a550-d12f-4a27-abae-e115391cbb13\") " Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.039340 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc2d8baf-2d1d-463c-b340-04a7a920de12-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dc2d8baf-2d1d-463c-b340-04a7a920de12" (UID: "dc2d8baf-2d1d-463c-b340-04a7a920de12"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.039351 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2005d542-3de1-4147-ad79-56126eccafdd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2005d542-3de1-4147-ad79-56126eccafdd" (UID: "2005d542-3de1-4147-ad79-56126eccafdd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.039639 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b86a550-d12f-4a27-abae-e115391cbb13-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6b86a550-d12f-4a27-abae-e115391cbb13" (UID: "6b86a550-d12f-4a27-abae-e115391cbb13"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.042161 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2005d542-3de1-4147-ad79-56126eccafdd-kube-api-access-696bc" (OuterVolumeSpecName: "kube-api-access-696bc") pod "2005d542-3de1-4147-ad79-56126eccafdd" (UID: "2005d542-3de1-4147-ad79-56126eccafdd"). InnerVolumeSpecName "kube-api-access-696bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.042822 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b86a550-d12f-4a27-abae-e115391cbb13-kube-api-access-2xlp8" (OuterVolumeSpecName: "kube-api-access-2xlp8") pod "6b86a550-d12f-4a27-abae-e115391cbb13" (UID: "6b86a550-d12f-4a27-abae-e115391cbb13"). InnerVolumeSpecName "kube-api-access-2xlp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.045125 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc2d8baf-2d1d-463c-b340-04a7a920de12-kube-api-access-rwpqq" (OuterVolumeSpecName: "kube-api-access-rwpqq") pod "dc2d8baf-2d1d-463c-b340-04a7a920de12" (UID: "dc2d8baf-2d1d-463c-b340-04a7a920de12"). InnerVolumeSpecName "kube-api-access-rwpqq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.140798 4565 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc2d8baf-2d1d-463c-b340-04a7a920de12-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.140970 4565 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2005d542-3de1-4147-ad79-56126eccafdd-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.141046 4565 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b86a550-d12f-4a27-abae-e115391cbb13-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.141318 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwpqq\" (UniqueName: \"kubernetes.io/projected/dc2d8baf-2d1d-463c-b340-04a7a920de12-kube-api-access-rwpqq\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.142802 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-696bc\" (UniqueName: \"kubernetes.io/projected/2005d542-3de1-4147-ad79-56126eccafdd-kube-api-access-696bc\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.142837 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xlp8\" (UniqueName: \"kubernetes.io/projected/6b86a550-d12f-4a27-abae-e115391cbb13-kube-api-access-2xlp8\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.211543 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dbb-account-create-gq67j" 
event={"ID":"6b86a550-d12f-4a27-abae-e115391cbb13","Type":"ContainerDied","Data":"06511dd9fe59f8bb150d9db1ea5a2104d4e64a05eeaf622c7a96b0d85372ca30"} Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.211585 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06511dd9fe59f8bb150d9db1ea5a2104d4e64a05eeaf622c7a96b0d85372ca30" Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.211554 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5dbb-account-create-gq67j" Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.214527 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bssz7" event={"ID":"2df05272-b4de-41ac-8d18-16c29398c0d4","Type":"ContainerStarted","Data":"012796fe09ecc36f52792d0175b31e322ecf7a3d334f4b8f042a6f239a937fbe"} Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.216469 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-19b0-account-create-g2zxc" Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.216465 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-19b0-account-create-g2zxc" event={"ID":"dc2d8baf-2d1d-463c-b340-04a7a920de12","Type":"ContainerDied","Data":"bf05b04dd177d5dd91781e8ef8474827a433d459e2d356306bf864d4206299ac"} Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.216580 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf05b04dd177d5dd91781e8ef8474827a433d459e2d356306bf864d4206299ac" Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.221782 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0b9e-account-create-ztz2x" event={"ID":"2005d542-3de1-4147-ad79-56126eccafdd","Type":"ContainerDied","Data":"83114b34b891c8866d4fc7207e6aca5270a31eb8c4d4e129a9899dd65ebc14cf"} Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.221810 4565 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83114b34b891c8866d4fc7207e6aca5270a31eb8c4d4e129a9899dd65ebc14cf" Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.221833 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0b9e-account-create-ztz2x" Nov 25 09:18:55 crc kubenswrapper[4565]: I1125 09:18:55.228689 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-bssz7" podStartSLOduration=2.95007068 podStartE2EDuration="8.228679422s" podCreationTimestamp="2025-11-25 09:18:47 +0000 UTC" firstStartedPulling="2025-11-25 09:18:49.582662834 +0000 UTC m=+862.785157972" lastFinishedPulling="2025-11-25 09:18:54.861271576 +0000 UTC m=+868.063766714" observedRunningTime="2025-11-25 09:18:55.226419912 +0000 UTC m=+868.428915050" watchObservedRunningTime="2025-11-25 09:18:55.228679422 +0000 UTC m=+868.431174559" Nov 25 09:18:56 crc kubenswrapper[4565]: I1125 09:18:56.086027 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-67p5z" Nov 25 09:18:56 crc kubenswrapper[4565]: I1125 09:18:56.122433 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-67p5z" Nov 25 09:18:56 crc kubenswrapper[4565]: I1125 09:18:56.330822 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-67p5z"] Nov 25 09:18:57 crc kubenswrapper[4565]: I1125 09:18:57.242004 4565 generic.go:334] "Generic (PLEG): container finished" podID="2df05272-b4de-41ac-8d18-16c29398c0d4" containerID="012796fe09ecc36f52792d0175b31e322ecf7a3d334f4b8f042a6f239a937fbe" exitCode=0 Nov 25 09:18:57 crc kubenswrapper[4565]: I1125 09:18:57.242078 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bssz7" 
event={"ID":"2df05272-b4de-41ac-8d18-16c29398c0d4","Type":"ContainerDied","Data":"012796fe09ecc36f52792d0175b31e322ecf7a3d334f4b8f042a6f239a937fbe"} Nov 25 09:18:57 crc kubenswrapper[4565]: I1125 09:18:57.242507 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-67p5z" podUID="92bcfb29-921b-482a-97be-e651dc2d0ff0" containerName="registry-server" containerID="cri-o://1bb60afe064f448e07e5eed2a8792637b12565d56c2c242aeaebe4ae849e99be" gracePeriod=2 Nov 25 09:18:57 crc kubenswrapper[4565]: I1125 09:18:57.687787 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-67p5z" Nov 25 09:18:57 crc kubenswrapper[4565]: I1125 09:18:57.796984 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcx2m\" (UniqueName: \"kubernetes.io/projected/92bcfb29-921b-482a-97be-e651dc2d0ff0-kube-api-access-dcx2m\") pod \"92bcfb29-921b-482a-97be-e651dc2d0ff0\" (UID: \"92bcfb29-921b-482a-97be-e651dc2d0ff0\") " Nov 25 09:18:57 crc kubenswrapper[4565]: I1125 09:18:57.797178 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92bcfb29-921b-482a-97be-e651dc2d0ff0-catalog-content\") pod \"92bcfb29-921b-482a-97be-e651dc2d0ff0\" (UID: \"92bcfb29-921b-482a-97be-e651dc2d0ff0\") " Nov 25 09:18:57 crc kubenswrapper[4565]: I1125 09:18:57.797307 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92bcfb29-921b-482a-97be-e651dc2d0ff0-utilities\") pod \"92bcfb29-921b-482a-97be-e651dc2d0ff0\" (UID: \"92bcfb29-921b-482a-97be-e651dc2d0ff0\") " Nov 25 09:18:57 crc kubenswrapper[4565]: I1125 09:18:57.798185 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92bcfb29-921b-482a-97be-e651dc2d0ff0-utilities" 
(OuterVolumeSpecName: "utilities") pod "92bcfb29-921b-482a-97be-e651dc2d0ff0" (UID: "92bcfb29-921b-482a-97be-e651dc2d0ff0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:18:57 crc kubenswrapper[4565]: I1125 09:18:57.798588 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92bcfb29-921b-482a-97be-e651dc2d0ff0-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:57 crc kubenswrapper[4565]: I1125 09:18:57.803293 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92bcfb29-921b-482a-97be-e651dc2d0ff0-kube-api-access-dcx2m" (OuterVolumeSpecName: "kube-api-access-dcx2m") pod "92bcfb29-921b-482a-97be-e651dc2d0ff0" (UID: "92bcfb29-921b-482a-97be-e651dc2d0ff0"). InnerVolumeSpecName "kube-api-access-dcx2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:18:57 crc kubenswrapper[4565]: I1125 09:18:57.869072 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92bcfb29-921b-482a-97be-e651dc2d0ff0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92bcfb29-921b-482a-97be-e651dc2d0ff0" (UID: "92bcfb29-921b-482a-97be-e651dc2d0ff0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:18:57 crc kubenswrapper[4565]: I1125 09:18:57.901035 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcx2m\" (UniqueName: \"kubernetes.io/projected/92bcfb29-921b-482a-97be-e651dc2d0ff0-kube-api-access-dcx2m\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:57 crc kubenswrapper[4565]: I1125 09:18:57.901088 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92bcfb29-921b-482a-97be-e651dc2d0ff0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.046077 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-569d458467-6r4gl" Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.106430 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c476d78c5-m6c7g"] Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.106635 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" podUID="e61ca2e3-33ba-4887-9753-144f603688b9" containerName="dnsmasq-dns" containerID="cri-o://87385280d79502c4ad17cf067ab21fe6201c1d2e7d6d26dbe29da377a8749cf6" gracePeriod=10 Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.285199 4565 generic.go:334] "Generic (PLEG): container finished" podID="92bcfb29-921b-482a-97be-e651dc2d0ff0" containerID="1bb60afe064f448e07e5eed2a8792637b12565d56c2c242aeaebe4ae849e99be" exitCode=0 Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.285539 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67p5z" event={"ID":"92bcfb29-921b-482a-97be-e651dc2d0ff0","Type":"ContainerDied","Data":"1bb60afe064f448e07e5eed2a8792637b12565d56c2c242aeaebe4ae849e99be"} Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.285598 4565 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-67p5z" event={"ID":"92bcfb29-921b-482a-97be-e651dc2d0ff0","Type":"ContainerDied","Data":"04ed6cd387d0d0d27991e29bbffd998939a59c9e657449eb7b80d93b1895b455"} Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.285619 4565 scope.go:117] "RemoveContainer" containerID="1bb60afe064f448e07e5eed2a8792637b12565d56c2c242aeaebe4ae849e99be" Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.285821 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-67p5z" Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.303551 4565 generic.go:334] "Generic (PLEG): container finished" podID="e61ca2e3-33ba-4887-9753-144f603688b9" containerID="87385280d79502c4ad17cf067ab21fe6201c1d2e7d6d26dbe29da377a8749cf6" exitCode=0 Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.303729 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" event={"ID":"e61ca2e3-33ba-4887-9753-144f603688b9","Type":"ContainerDied","Data":"87385280d79502c4ad17cf067ab21fe6201c1d2e7d6d26dbe29da377a8749cf6"} Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.403365 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-67p5z"] Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.419190 4565 scope.go:117] "RemoveContainer" containerID="65263ed301573bdc1c06512f776112473cbc8d1b8380004e44fae9f9e91fb967" Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.421775 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-67p5z"] Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.461128 4565 scope.go:117] "RemoveContainer" containerID="805d41259d918cab1942959ff410d7f8f2bcf65c92a98749408d26661e405a15" Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.501357 4565 scope.go:117] "RemoveContainer" 
containerID="1bb60afe064f448e07e5eed2a8792637b12565d56c2c242aeaebe4ae849e99be" Nov 25 09:18:58 crc kubenswrapper[4565]: E1125 09:18:58.501976 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bb60afe064f448e07e5eed2a8792637b12565d56c2c242aeaebe4ae849e99be\": container with ID starting with 1bb60afe064f448e07e5eed2a8792637b12565d56c2c242aeaebe4ae849e99be not found: ID does not exist" containerID="1bb60afe064f448e07e5eed2a8792637b12565d56c2c242aeaebe4ae849e99be" Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.502040 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bb60afe064f448e07e5eed2a8792637b12565d56c2c242aeaebe4ae849e99be"} err="failed to get container status \"1bb60afe064f448e07e5eed2a8792637b12565d56c2c242aeaebe4ae849e99be\": rpc error: code = NotFound desc = could not find container \"1bb60afe064f448e07e5eed2a8792637b12565d56c2c242aeaebe4ae849e99be\": container with ID starting with 1bb60afe064f448e07e5eed2a8792637b12565d56c2c242aeaebe4ae849e99be not found: ID does not exist" Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.502072 4565 scope.go:117] "RemoveContainer" containerID="65263ed301573bdc1c06512f776112473cbc8d1b8380004e44fae9f9e91fb967" Nov 25 09:18:58 crc kubenswrapper[4565]: E1125 09:18:58.502347 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65263ed301573bdc1c06512f776112473cbc8d1b8380004e44fae9f9e91fb967\": container with ID starting with 65263ed301573bdc1c06512f776112473cbc8d1b8380004e44fae9f9e91fb967 not found: ID does not exist" containerID="65263ed301573bdc1c06512f776112473cbc8d1b8380004e44fae9f9e91fb967" Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.502369 4565 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"65263ed301573bdc1c06512f776112473cbc8d1b8380004e44fae9f9e91fb967"} err="failed to get container status \"65263ed301573bdc1c06512f776112473cbc8d1b8380004e44fae9f9e91fb967\": rpc error: code = NotFound desc = could not find container \"65263ed301573bdc1c06512f776112473cbc8d1b8380004e44fae9f9e91fb967\": container with ID starting with 65263ed301573bdc1c06512f776112473cbc8d1b8380004e44fae9f9e91fb967 not found: ID does not exist" Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.502403 4565 scope.go:117] "RemoveContainer" containerID="805d41259d918cab1942959ff410d7f8f2bcf65c92a98749408d26661e405a15" Nov 25 09:18:58 crc kubenswrapper[4565]: E1125 09:18:58.502600 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"805d41259d918cab1942959ff410d7f8f2bcf65c92a98749408d26661e405a15\": container with ID starting with 805d41259d918cab1942959ff410d7f8f2bcf65c92a98749408d26661e405a15 not found: ID does not exist" containerID="805d41259d918cab1942959ff410d7f8f2bcf65c92a98749408d26661e405a15" Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.502641 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"805d41259d918cab1942959ff410d7f8f2bcf65c92a98749408d26661e405a15"} err="failed to get container status \"805d41259d918cab1942959ff410d7f8f2bcf65c92a98749408d26661e405a15\": rpc error: code = NotFound desc = could not find container \"805d41259d918cab1942959ff410d7f8f2bcf65c92a98749408d26661e405a15\": container with ID starting with 805d41259d918cab1942959ff410d7f8f2bcf65c92a98749408d26661e405a15 not found: ID does not exist" Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.772169 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-bssz7" Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.819589 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkdtv\" (UniqueName: \"kubernetes.io/projected/2df05272-b4de-41ac-8d18-16c29398c0d4-kube-api-access-zkdtv\") pod \"2df05272-b4de-41ac-8d18-16c29398c0d4\" (UID: \"2df05272-b4de-41ac-8d18-16c29398c0d4\") " Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.819759 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2df05272-b4de-41ac-8d18-16c29398c0d4-combined-ca-bundle\") pod \"2df05272-b4de-41ac-8d18-16c29398c0d4\" (UID: \"2df05272-b4de-41ac-8d18-16c29398c0d4\") " Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.819800 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2df05272-b4de-41ac-8d18-16c29398c0d4-config-data\") pod \"2df05272-b4de-41ac-8d18-16c29398c0d4\" (UID: \"2df05272-b4de-41ac-8d18-16c29398c0d4\") " Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.825377 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2df05272-b4de-41ac-8d18-16c29398c0d4-kube-api-access-zkdtv" (OuterVolumeSpecName: "kube-api-access-zkdtv") pod "2df05272-b4de-41ac-8d18-16c29398c0d4" (UID: "2df05272-b4de-41ac-8d18-16c29398c0d4"). InnerVolumeSpecName "kube-api-access-zkdtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.826918 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.851067 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2df05272-b4de-41ac-8d18-16c29398c0d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2df05272-b4de-41ac-8d18-16c29398c0d4" (UID: "2df05272-b4de-41ac-8d18-16c29398c0d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.856895 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2df05272-b4de-41ac-8d18-16c29398c0d4-config-data" (OuterVolumeSpecName: "config-data") pod "2df05272-b4de-41ac-8d18-16c29398c0d4" (UID: "2df05272-b4de-41ac-8d18-16c29398c0d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.921900 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61ca2e3-33ba-4887-9753-144f603688b9-config\") pod \"e61ca2e3-33ba-4887-9753-144f603688b9\" (UID: \"e61ca2e3-33ba-4887-9753-144f603688b9\") " Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.921992 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e61ca2e3-33ba-4887-9753-144f603688b9-ovsdbserver-nb\") pod \"e61ca2e3-33ba-4887-9753-144f603688b9\" (UID: \"e61ca2e3-33ba-4887-9753-144f603688b9\") " Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.922042 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e61ca2e3-33ba-4887-9753-144f603688b9-ovsdbserver-sb\") pod \"e61ca2e3-33ba-4887-9753-144f603688b9\" (UID: \"e61ca2e3-33ba-4887-9753-144f603688b9\") " Nov 25 
09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.922113 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e61ca2e3-33ba-4887-9753-144f603688b9-dns-svc\") pod \"e61ca2e3-33ba-4887-9753-144f603688b9\" (UID: \"e61ca2e3-33ba-4887-9753-144f603688b9\") " Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.922174 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpvlj\" (UniqueName: \"kubernetes.io/projected/e61ca2e3-33ba-4887-9753-144f603688b9-kube-api-access-tpvlj\") pod \"e61ca2e3-33ba-4887-9753-144f603688b9\" (UID: \"e61ca2e3-33ba-4887-9753-144f603688b9\") " Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.922900 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkdtv\" (UniqueName: \"kubernetes.io/projected/2df05272-b4de-41ac-8d18-16c29398c0d4-kube-api-access-zkdtv\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.922912 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2df05272-b4de-41ac-8d18-16c29398c0d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.922941 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2df05272-b4de-41ac-8d18-16c29398c0d4-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.929140 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e61ca2e3-33ba-4887-9753-144f603688b9-kube-api-access-tpvlj" (OuterVolumeSpecName: "kube-api-access-tpvlj") pod "e61ca2e3-33ba-4887-9753-144f603688b9" (UID: "e61ca2e3-33ba-4887-9753-144f603688b9"). InnerVolumeSpecName "kube-api-access-tpvlj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.952068 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e61ca2e3-33ba-4887-9753-144f603688b9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e61ca2e3-33ba-4887-9753-144f603688b9" (UID: "e61ca2e3-33ba-4887-9753-144f603688b9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.957603 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e61ca2e3-33ba-4887-9753-144f603688b9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e61ca2e3-33ba-4887-9753-144f603688b9" (UID: "e61ca2e3-33ba-4887-9753-144f603688b9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.958189 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e61ca2e3-33ba-4887-9753-144f603688b9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e61ca2e3-33ba-4887-9753-144f603688b9" (UID: "e61ca2e3-33ba-4887-9753-144f603688b9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:58 crc kubenswrapper[4565]: I1125 09:18:58.962062 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e61ca2e3-33ba-4887-9753-144f603688b9-config" (OuterVolumeSpecName: "config") pod "e61ca2e3-33ba-4887-9753-144f603688b9" (UID: "e61ca2e3-33ba-4887-9753-144f603688b9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.025298 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpvlj\" (UniqueName: \"kubernetes.io/projected/e61ca2e3-33ba-4887-9753-144f603688b9-kube-api-access-tpvlj\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.025357 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61ca2e3-33ba-4887-9753-144f603688b9-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.025382 4565 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e61ca2e3-33ba-4887-9753-144f603688b9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.025393 4565 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e61ca2e3-33ba-4887-9753-144f603688b9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.025404 4565 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e61ca2e3-33ba-4887-9753-144f603688b9-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.105572 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92bcfb29-921b-482a-97be-e651dc2d0ff0" path="/var/lib/kubelet/pods/92bcfb29-921b-482a-97be-e651dc2d0ff0/volumes" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.312284 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bssz7" event={"ID":"2df05272-b4de-41ac-8d18-16c29398c0d4","Type":"ContainerDied","Data":"043ff5fa2fad2dd9488b7270b3c870909049b4c186e19165361fdda24c85b5ee"} Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.312328 
4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="043ff5fa2fad2dd9488b7270b3c870909049b4c186e19165361fdda24c85b5ee" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.312302 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bssz7" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.314495 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" event={"ID":"e61ca2e3-33ba-4887-9753-144f603688b9","Type":"ContainerDied","Data":"b7ab00e93d21e428d806dd7f3862f62794b6df8e61ee08a09a751dd7150de0bd"} Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.314550 4565 scope.go:117] "RemoveContainer" containerID="87385280d79502c4ad17cf067ab21fe6201c1d2e7d6d26dbe29da377a8749cf6" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.314649 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c476d78c5-m6c7g" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.330310 4565 scope.go:117] "RemoveContainer" containerID="68384ca66545b8d936cecc7e68c189ced4aff0bcd961517f1e32d583455b4fe9" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.335665 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c476d78c5-m6c7g"] Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.344906 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c476d78c5-m6c7g"] Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.505012 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b76c757b7-z9pfj"] Nov 25 09:18:59 crc kubenswrapper[4565]: E1125 09:18:59.505345 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92bcfb29-921b-482a-97be-e651dc2d0ff0" containerName="extract-content" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.505364 4565 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="92bcfb29-921b-482a-97be-e651dc2d0ff0" containerName="extract-content" Nov 25 09:18:59 crc kubenswrapper[4565]: E1125 09:18:59.505375 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc2d8baf-2d1d-463c-b340-04a7a920de12" containerName="mariadb-account-create" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.505381 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc2d8baf-2d1d-463c-b340-04a7a920de12" containerName="mariadb-account-create" Nov 25 09:18:59 crc kubenswrapper[4565]: E1125 09:18:59.505390 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92bcfb29-921b-482a-97be-e651dc2d0ff0" containerName="registry-server" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.505397 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="92bcfb29-921b-482a-97be-e651dc2d0ff0" containerName="registry-server" Nov 25 09:18:59 crc kubenswrapper[4565]: E1125 09:18:59.505407 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b86a550-d12f-4a27-abae-e115391cbb13" containerName="mariadb-account-create" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.505412 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b86a550-d12f-4a27-abae-e115391cbb13" containerName="mariadb-account-create" Nov 25 09:18:59 crc kubenswrapper[4565]: E1125 09:18:59.505419 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f29e809-94fa-4ff1-85d9-b1a6786a1763" containerName="mariadb-database-create" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.505424 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f29e809-94fa-4ff1-85d9-b1a6786a1763" containerName="mariadb-database-create" Nov 25 09:18:59 crc kubenswrapper[4565]: E1125 09:18:59.505436 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a355a7eb-1738-4c1b-975b-767821b77d5a" containerName="mariadb-database-create" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.505441 
4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a355a7eb-1738-4c1b-975b-767821b77d5a" containerName="mariadb-database-create" Nov 25 09:18:59 crc kubenswrapper[4565]: E1125 09:18:59.505455 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e61ca2e3-33ba-4887-9753-144f603688b9" containerName="init" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.505460 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="e61ca2e3-33ba-4887-9753-144f603688b9" containerName="init" Nov 25 09:18:59 crc kubenswrapper[4565]: E1125 09:18:59.505468 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2005d542-3de1-4147-ad79-56126eccafdd" containerName="mariadb-account-create" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.505474 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="2005d542-3de1-4147-ad79-56126eccafdd" containerName="mariadb-account-create" Nov 25 09:18:59 crc kubenswrapper[4565]: E1125 09:18:59.505487 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92bcfb29-921b-482a-97be-e651dc2d0ff0" containerName="extract-utilities" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.505492 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="92bcfb29-921b-482a-97be-e651dc2d0ff0" containerName="extract-utilities" Nov 25 09:18:59 crc kubenswrapper[4565]: E1125 09:18:59.505502 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26517de-faae-4251-84d1-e9626a575d10" containerName="mariadb-database-create" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.505507 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26517de-faae-4251-84d1-e9626a575d10" containerName="mariadb-database-create" Nov 25 09:18:59 crc kubenswrapper[4565]: E1125 09:18:59.505515 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e61ca2e3-33ba-4887-9753-144f603688b9" containerName="dnsmasq-dns" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 
09:18:59.505521 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="e61ca2e3-33ba-4887-9753-144f603688b9" containerName="dnsmasq-dns" Nov 25 09:18:59 crc kubenswrapper[4565]: E1125 09:18:59.505533 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df05272-b4de-41ac-8d18-16c29398c0d4" containerName="keystone-db-sync" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.505538 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df05272-b4de-41ac-8d18-16c29398c0d4" containerName="keystone-db-sync" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.505683 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc2d8baf-2d1d-463c-b340-04a7a920de12" containerName="mariadb-account-create" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.505692 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="d26517de-faae-4251-84d1-e9626a575d10" containerName="mariadb-database-create" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.505699 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="2005d542-3de1-4147-ad79-56126eccafdd" containerName="mariadb-account-create" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.505707 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="92bcfb29-921b-482a-97be-e651dc2d0ff0" containerName="registry-server" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.505715 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f29e809-94fa-4ff1-85d9-b1a6786a1763" containerName="mariadb-database-create" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.505723 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="e61ca2e3-33ba-4887-9753-144f603688b9" containerName="dnsmasq-dns" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.505729 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b86a550-d12f-4a27-abae-e115391cbb13" containerName="mariadb-account-create" Nov 
25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.505739 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="a355a7eb-1738-4c1b-975b-767821b77d5a" containerName="mariadb-database-create" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.505749 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="2df05272-b4de-41ac-8d18-16c29398c0d4" containerName="keystone-db-sync" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.506535 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b76c757b7-z9pfj" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.520775 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b76c757b7-z9pfj"] Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.561414 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-nv87t"] Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.562730 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-nv87t" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.566110 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.566290 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.566301 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tx2k9" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.566796 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.566804 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.573599 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nv87t"] Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.635724 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrnvt\" (UniqueName: \"kubernetes.io/projected/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-kube-api-access-jrnvt\") pod \"dnsmasq-dns-b76c757b7-z9pfj\" (UID: \"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\") " pod="openstack/dnsmasq-dns-b76c757b7-z9pfj" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.635864 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-credential-keys\") pod \"keystone-bootstrap-nv87t\" (UID: \"5c331635-d5f5-4abd-8765-6448ebcc79a3\") " pod="openstack/keystone-bootstrap-nv87t" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.635960 4565 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptvll\" (UniqueName: \"kubernetes.io/projected/5c331635-d5f5-4abd-8765-6448ebcc79a3-kube-api-access-ptvll\") pod \"keystone-bootstrap-nv87t\" (UID: \"5c331635-d5f5-4abd-8765-6448ebcc79a3\") " pod="openstack/keystone-bootstrap-nv87t" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.636022 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-combined-ca-bundle\") pod \"keystone-bootstrap-nv87t\" (UID: \"5c331635-d5f5-4abd-8765-6448ebcc79a3\") " pod="openstack/keystone-bootstrap-nv87t" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.636043 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-dns-svc\") pod \"dnsmasq-dns-b76c757b7-z9pfj\" (UID: \"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\") " pod="openstack/dnsmasq-dns-b76c757b7-z9pfj" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.636112 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-config-data\") pod \"keystone-bootstrap-nv87t\" (UID: \"5c331635-d5f5-4abd-8765-6448ebcc79a3\") " pod="openstack/keystone-bootstrap-nv87t" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.636192 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-ovsdbserver-sb\") pod \"dnsmasq-dns-b76c757b7-z9pfj\" (UID: \"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\") " pod="openstack/dnsmasq-dns-b76c757b7-z9pfj" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.636246 4565 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-fernet-keys\") pod \"keystone-bootstrap-nv87t\" (UID: \"5c331635-d5f5-4abd-8765-6448ebcc79a3\") " pod="openstack/keystone-bootstrap-nv87t" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.636281 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-scripts\") pod \"keystone-bootstrap-nv87t\" (UID: \"5c331635-d5f5-4abd-8765-6448ebcc79a3\") " pod="openstack/keystone-bootstrap-nv87t" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.636303 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-config\") pod \"dnsmasq-dns-b76c757b7-z9pfj\" (UID: \"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\") " pod="openstack/dnsmasq-dns-b76c757b7-z9pfj" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.636384 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-ovsdbserver-nb\") pod \"dnsmasq-dns-b76c757b7-z9pfj\" (UID: \"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\") " pod="openstack/dnsmasq-dns-b76c757b7-z9pfj" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.701494 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.703208 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.706198 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.712373 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.715289 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.737448 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-combined-ca-bundle\") pod \"keystone-bootstrap-nv87t\" (UID: \"5c331635-d5f5-4abd-8765-6448ebcc79a3\") " pod="openstack/keystone-bootstrap-nv87t" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.737487 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-dns-svc\") pod \"dnsmasq-dns-b76c757b7-z9pfj\" (UID: \"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\") " pod="openstack/dnsmasq-dns-b76c757b7-z9pfj" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.737535 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-config-data\") pod \"keystone-bootstrap-nv87t\" (UID: \"5c331635-d5f5-4abd-8765-6448ebcc79a3\") " pod="openstack/keystone-bootstrap-nv87t" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.737561 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-ovsdbserver-sb\") pod \"dnsmasq-dns-b76c757b7-z9pfj\" (UID: 
\"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\") " pod="openstack/dnsmasq-dns-b76c757b7-z9pfj" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.737583 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-run-httpd\") pod \"ceilometer-0\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " pod="openstack/ceilometer-0" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.737613 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-fernet-keys\") pod \"keystone-bootstrap-nv87t\" (UID: \"5c331635-d5f5-4abd-8765-6448ebcc79a3\") " pod="openstack/keystone-bootstrap-nv87t" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.737632 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-scripts\") pod \"keystone-bootstrap-nv87t\" (UID: \"5c331635-d5f5-4abd-8765-6448ebcc79a3\") " pod="openstack/keystone-bootstrap-nv87t" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.737652 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p24l\" (UniqueName: \"kubernetes.io/projected/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-kube-api-access-5p24l\") pod \"ceilometer-0\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " pod="openstack/ceilometer-0" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.737669 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-config\") pod \"dnsmasq-dns-b76c757b7-z9pfj\" (UID: \"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\") " pod="openstack/dnsmasq-dns-b76c757b7-z9pfj" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 
09:18:59.737690 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-scripts\") pod \"ceilometer-0\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " pod="openstack/ceilometer-0" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.737715 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-ovsdbserver-nb\") pod \"dnsmasq-dns-b76c757b7-z9pfj\" (UID: \"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\") " pod="openstack/dnsmasq-dns-b76c757b7-z9pfj" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.737744 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " pod="openstack/ceilometer-0" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.737773 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-log-httpd\") pod \"ceilometer-0\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " pod="openstack/ceilometer-0" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.737811 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrnvt\" (UniqueName: \"kubernetes.io/projected/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-kube-api-access-jrnvt\") pod \"dnsmasq-dns-b76c757b7-z9pfj\" (UID: \"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\") " pod="openstack/dnsmasq-dns-b76c757b7-z9pfj" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.737873 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-credential-keys\") pod \"keystone-bootstrap-nv87t\" (UID: \"5c331635-d5f5-4abd-8765-6448ebcc79a3\") " pod="openstack/keystone-bootstrap-nv87t" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.737894 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " pod="openstack/ceilometer-0" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.737935 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptvll\" (UniqueName: \"kubernetes.io/projected/5c331635-d5f5-4abd-8765-6448ebcc79a3-kube-api-access-ptvll\") pod \"keystone-bootstrap-nv87t\" (UID: \"5c331635-d5f5-4abd-8765-6448ebcc79a3\") " pod="openstack/keystone-bootstrap-nv87t" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.737967 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-config-data\") pod \"ceilometer-0\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " pod="openstack/ceilometer-0" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.739033 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-dns-svc\") pod \"dnsmasq-dns-b76c757b7-z9pfj\" (UID: \"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\") " pod="openstack/dnsmasq-dns-b76c757b7-z9pfj" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.740564 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-ovsdbserver-sb\") pod 
\"dnsmasq-dns-b76c757b7-z9pfj\" (UID: \"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\") " pod="openstack/dnsmasq-dns-b76c757b7-z9pfj" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.741109 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-config\") pod \"dnsmasq-dns-b76c757b7-z9pfj\" (UID: \"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\") " pod="openstack/dnsmasq-dns-b76c757b7-z9pfj" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.741656 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-ovsdbserver-nb\") pod \"dnsmasq-dns-b76c757b7-z9pfj\" (UID: \"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\") " pod="openstack/dnsmasq-dns-b76c757b7-z9pfj" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.746765 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-config-data\") pod \"keystone-bootstrap-nv87t\" (UID: \"5c331635-d5f5-4abd-8765-6448ebcc79a3\") " pod="openstack/keystone-bootstrap-nv87t" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.747804 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-credential-keys\") pod \"keystone-bootstrap-nv87t\" (UID: \"5c331635-d5f5-4abd-8765-6448ebcc79a3\") " pod="openstack/keystone-bootstrap-nv87t" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.759451 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-combined-ca-bundle\") pod \"keystone-bootstrap-nv87t\" (UID: \"5c331635-d5f5-4abd-8765-6448ebcc79a3\") " pod="openstack/keystone-bootstrap-nv87t" Nov 25 09:18:59 
crc kubenswrapper[4565]: I1125 09:18:59.760288 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-scripts\") pod \"keystone-bootstrap-nv87t\" (UID: \"5c331635-d5f5-4abd-8765-6448ebcc79a3\") " pod="openstack/keystone-bootstrap-nv87t" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.760764 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-fernet-keys\") pod \"keystone-bootstrap-nv87t\" (UID: \"5c331635-d5f5-4abd-8765-6448ebcc79a3\") " pod="openstack/keystone-bootstrap-nv87t" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.765423 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrnvt\" (UniqueName: \"kubernetes.io/projected/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-kube-api-access-jrnvt\") pod \"dnsmasq-dns-b76c757b7-z9pfj\" (UID: \"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\") " pod="openstack/dnsmasq-dns-b76c757b7-z9pfj" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.766839 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptvll\" (UniqueName: \"kubernetes.io/projected/5c331635-d5f5-4abd-8765-6448ebcc79a3-kube-api-access-ptvll\") pod \"keystone-bootstrap-nv87t\" (UID: \"5c331635-d5f5-4abd-8765-6448ebcc79a3\") " pod="openstack/keystone-bootstrap-nv87t" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.819041 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b76c757b7-z9pfj" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.843855 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " pod="openstack/ceilometer-0" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.843914 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-config-data\") pod \"ceilometer-0\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " pod="openstack/ceilometer-0" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.844033 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-run-httpd\") pod \"ceilometer-0\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " pod="openstack/ceilometer-0" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.844063 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p24l\" (UniqueName: \"kubernetes.io/projected/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-kube-api-access-5p24l\") pod \"ceilometer-0\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " pod="openstack/ceilometer-0" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.844083 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-scripts\") pod \"ceilometer-0\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " pod="openstack/ceilometer-0" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.844114 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " pod="openstack/ceilometer-0" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.844141 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-log-httpd\") pod \"ceilometer-0\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " pod="openstack/ceilometer-0" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.844560 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-log-httpd\") pod \"ceilometer-0\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " pod="openstack/ceilometer-0" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.845570 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-run-httpd\") pod \"ceilometer-0\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " pod="openstack/ceilometer-0" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.850358 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " pod="openstack/ceilometer-0" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.856563 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-scripts\") pod \"ceilometer-0\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " pod="openstack/ceilometer-0" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.856883 4565 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-config-data\") pod \"ceilometer-0\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " pod="openstack/ceilometer-0" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.858031 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " pod="openstack/ceilometer-0" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.868225 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-66vnw"] Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.869491 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-66vnw" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.874450 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wgk4c" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.874637 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.874758 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.876474 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-9gszq"] Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.878033 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-9gszq" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.882475 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.882643 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-t466t" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.882752 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.883364 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nv87t" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.913718 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-66vnw"] Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.928032 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9gszq"] Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.940203 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p24l\" (UniqueName: \"kubernetes.io/projected/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-kube-api-access-5p24l\") pod \"ceilometer-0\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " pod="openstack/ceilometer-0" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.946698 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b855517-9e82-425c-ab21-321b63053540-combined-ca-bundle\") pod \"neutron-db-sync-66vnw\" (UID: \"7b855517-9e82-425c-ab21-321b63053540\") " pod="openstack/neutron-db-sync-66vnw" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.946761 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7cb2n\" (UniqueName: \"kubernetes.io/projected/7b855517-9e82-425c-ab21-321b63053540-kube-api-access-7cb2n\") pod \"neutron-db-sync-66vnw\" (UID: \"7b855517-9e82-425c-ab21-321b63053540\") " pod="openstack/neutron-db-sync-66vnw" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.946863 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-db-sync-config-data\") pod \"cinder-db-sync-9gszq\" (UID: \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\") " pod="openstack/cinder-db-sync-9gszq" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.946917 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-etc-machine-id\") pod \"cinder-db-sync-9gszq\" (UID: \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\") " pod="openstack/cinder-db-sync-9gszq" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.946978 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-scripts\") pod \"cinder-db-sync-9gszq\" (UID: \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\") " pod="openstack/cinder-db-sync-9gszq" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.947077 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b855517-9e82-425c-ab21-321b63053540-config\") pod \"neutron-db-sync-66vnw\" (UID: \"7b855517-9e82-425c-ab21-321b63053540\") " pod="openstack/neutron-db-sync-66vnw" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.947099 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs8gv\" (UniqueName: 
\"kubernetes.io/projected/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-kube-api-access-fs8gv\") pod \"cinder-db-sync-9gszq\" (UID: \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\") " pod="openstack/cinder-db-sync-9gszq" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.947206 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-combined-ca-bundle\") pod \"cinder-db-sync-9gszq\" (UID: \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\") " pod="openstack/cinder-db-sync-9gszq" Nov 25 09:18:59 crc kubenswrapper[4565]: I1125 09:18:59.947244 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-config-data\") pod \"cinder-db-sync-9gszq\" (UID: \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\") " pod="openstack/cinder-db-sync-9gszq" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.010989 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b76c757b7-z9pfj"] Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.032463 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.056800 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-combined-ca-bundle\") pod \"cinder-db-sync-9gszq\" (UID: \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\") " pod="openstack/cinder-db-sync-9gszq" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.057115 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-config-data\") pod \"cinder-db-sync-9gszq\" (UID: \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\") " pod="openstack/cinder-db-sync-9gszq" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.057344 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b855517-9e82-425c-ab21-321b63053540-combined-ca-bundle\") pod \"neutron-db-sync-66vnw\" (UID: \"7b855517-9e82-425c-ab21-321b63053540\") " pod="openstack/neutron-db-sync-66vnw" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.057427 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cb2n\" (UniqueName: \"kubernetes.io/projected/7b855517-9e82-425c-ab21-321b63053540-kube-api-access-7cb2n\") pod \"neutron-db-sync-66vnw\" (UID: \"7b855517-9e82-425c-ab21-321b63053540\") " pod="openstack/neutron-db-sync-66vnw" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.058955 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-db-sync-config-data\") pod \"cinder-db-sync-9gszq\" (UID: \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\") " pod="openstack/cinder-db-sync-9gszq" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 
09:19:00.059032 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-etc-machine-id\") pod \"cinder-db-sync-9gszq\" (UID: \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\") " pod="openstack/cinder-db-sync-9gszq" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.059075 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-scripts\") pod \"cinder-db-sync-9gszq\" (UID: \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\") " pod="openstack/cinder-db-sync-9gszq" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.059187 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b855517-9e82-425c-ab21-321b63053540-config\") pod \"neutron-db-sync-66vnw\" (UID: \"7b855517-9e82-425c-ab21-321b63053540\") " pod="openstack/neutron-db-sync-66vnw" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.059207 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs8gv\" (UniqueName: \"kubernetes.io/projected/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-kube-api-access-fs8gv\") pod \"cinder-db-sync-9gszq\" (UID: \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\") " pod="openstack/cinder-db-sync-9gszq" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.062702 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-etc-machine-id\") pod \"cinder-db-sync-9gszq\" (UID: \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\") " pod="openstack/cinder-db-sync-9gszq" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.075534 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-db-sync-config-data\") pod \"cinder-db-sync-9gszq\" (UID: \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\") " pod="openstack/cinder-db-sync-9gszq" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.077664 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b855517-9e82-425c-ab21-321b63053540-combined-ca-bundle\") pod \"neutron-db-sync-66vnw\" (UID: \"7b855517-9e82-425c-ab21-321b63053540\") " pod="openstack/neutron-db-sync-66vnw" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.078226 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-combined-ca-bundle\") pod \"cinder-db-sync-9gszq\" (UID: \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\") " pod="openstack/cinder-db-sync-9gszq" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.078541 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-scripts\") pod \"cinder-db-sync-9gszq\" (UID: \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\") " pod="openstack/cinder-db-sync-9gszq" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.079365 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b855517-9e82-425c-ab21-321b63053540-config\") pod \"neutron-db-sync-66vnw\" (UID: \"7b855517-9e82-425c-ab21-321b63053540\") " pod="openstack/neutron-db-sync-66vnw" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.101875 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-config-data\") pod \"cinder-db-sync-9gszq\" (UID: \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\") " pod="openstack/cinder-db-sync-9gszq" Nov 25 
09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.111901 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs8gv\" (UniqueName: \"kubernetes.io/projected/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-kube-api-access-fs8gv\") pod \"cinder-db-sync-9gszq\" (UID: \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\") " pod="openstack/cinder-db-sync-9gszq" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.118968 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cb2n\" (UniqueName: \"kubernetes.io/projected/7b855517-9e82-425c-ab21-321b63053540-kube-api-access-7cb2n\") pod \"neutron-db-sync-66vnw\" (UID: \"7b855517-9e82-425c-ab21-321b63053540\") " pod="openstack/neutron-db-sync-66vnw" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.133226 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66f4bdbdb7-p7zfn"] Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.155472 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.166838 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82113c4e-4aec-4aed-8edf-8918d49c64de-ovsdbserver-nb\") pod \"dnsmasq-dns-66f4bdbdb7-p7zfn\" (UID: \"82113c4e-4aec-4aed-8edf-8918d49c64de\") " pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.167180 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82113c4e-4aec-4aed-8edf-8918d49c64de-ovsdbserver-sb\") pod \"dnsmasq-dns-66f4bdbdb7-p7zfn\" (UID: \"82113c4e-4aec-4aed-8edf-8918d49c64de\") " pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.167331 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6gjz\" (UniqueName: \"kubernetes.io/projected/82113c4e-4aec-4aed-8edf-8918d49c64de-kube-api-access-j6gjz\") pod \"dnsmasq-dns-66f4bdbdb7-p7zfn\" (UID: \"82113c4e-4aec-4aed-8edf-8918d49c64de\") " pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.167394 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82113c4e-4aec-4aed-8edf-8918d49c64de-dns-svc\") pod \"dnsmasq-dns-66f4bdbdb7-p7zfn\" (UID: \"82113c4e-4aec-4aed-8edf-8918d49c64de\") " pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.167654 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82113c4e-4aec-4aed-8edf-8918d49c64de-config\") pod \"dnsmasq-dns-66f4bdbdb7-p7zfn\" 
(UID: \"82113c4e-4aec-4aed-8edf-8918d49c64de\") " pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.167966 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66f4bdbdb7-p7zfn"] Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.199711 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-bs9qc"] Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.201313 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-bs9qc" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.203568 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-kgbtq" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.203833 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.208914 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-bs9qc"] Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.227014 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-wz4lt"] Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.228324 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-wz4lt" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.230748 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ntkbc" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.230777 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.233699 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.243656 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-66vnw" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.245745 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-wz4lt"] Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.272938 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82113c4e-4aec-4aed-8edf-8918d49c64de-config\") pod \"dnsmasq-dns-66f4bdbdb7-p7zfn\" (UID: \"82113c4e-4aec-4aed-8edf-8918d49c64de\") " pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.273109 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82113c4e-4aec-4aed-8edf-8918d49c64de-ovsdbserver-nb\") pod \"dnsmasq-dns-66f4bdbdb7-p7zfn\" (UID: \"82113c4e-4aec-4aed-8edf-8918d49c64de\") " pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.273207 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/754ce7b0-5a2a-4206-a901-94010cec0e08-scripts\") pod \"placement-db-sync-wz4lt\" (UID: 
\"754ce7b0-5a2a-4206-a901-94010cec0e08\") " pod="openstack/placement-db-sync-wz4lt" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.273345 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/754ce7b0-5a2a-4206-a901-94010cec0e08-combined-ca-bundle\") pod \"placement-db-sync-wz4lt\" (UID: \"754ce7b0-5a2a-4206-a901-94010cec0e08\") " pod="openstack/placement-db-sync-wz4lt" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.273428 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8mh2\" (UniqueName: \"kubernetes.io/projected/754ce7b0-5a2a-4206-a901-94010cec0e08-kube-api-access-t8mh2\") pod \"placement-db-sync-wz4lt\" (UID: \"754ce7b0-5a2a-4206-a901-94010cec0e08\") " pod="openstack/placement-db-sync-wz4lt" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.273514 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fad94c28-1e87-4955-9d65-ca6cfdd4087f-db-sync-config-data\") pod \"barbican-db-sync-bs9qc\" (UID: \"fad94c28-1e87-4955-9d65-ca6cfdd4087f\") " pod="openstack/barbican-db-sync-bs9qc" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.273597 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82113c4e-4aec-4aed-8edf-8918d49c64de-ovsdbserver-sb\") pod \"dnsmasq-dns-66f4bdbdb7-p7zfn\" (UID: \"82113c4e-4aec-4aed-8edf-8918d49c64de\") " pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.273660 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sqqn\" (UniqueName: \"kubernetes.io/projected/fad94c28-1e87-4955-9d65-ca6cfdd4087f-kube-api-access-2sqqn\") pod 
\"barbican-db-sync-bs9qc\" (UID: \"fad94c28-1e87-4955-9d65-ca6cfdd4087f\") " pod="openstack/barbican-db-sync-bs9qc" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.273732 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6gjz\" (UniqueName: \"kubernetes.io/projected/82113c4e-4aec-4aed-8edf-8918d49c64de-kube-api-access-j6gjz\") pod \"dnsmasq-dns-66f4bdbdb7-p7zfn\" (UID: \"82113c4e-4aec-4aed-8edf-8918d49c64de\") " pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.273801 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82113c4e-4aec-4aed-8edf-8918d49c64de-dns-svc\") pod \"dnsmasq-dns-66f4bdbdb7-p7zfn\" (UID: \"82113c4e-4aec-4aed-8edf-8918d49c64de\") " pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.273893 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/754ce7b0-5a2a-4206-a901-94010cec0e08-logs\") pod \"placement-db-sync-wz4lt\" (UID: \"754ce7b0-5a2a-4206-a901-94010cec0e08\") " pod="openstack/placement-db-sync-wz4lt" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.274023 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fad94c28-1e87-4955-9d65-ca6cfdd4087f-combined-ca-bundle\") pod \"barbican-db-sync-bs9qc\" (UID: \"fad94c28-1e87-4955-9d65-ca6cfdd4087f\") " pod="openstack/barbican-db-sync-bs9qc" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.274126 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/754ce7b0-5a2a-4206-a901-94010cec0e08-config-data\") pod \"placement-db-sync-wz4lt\" (UID: 
\"754ce7b0-5a2a-4206-a901-94010cec0e08\") " pod="openstack/placement-db-sync-wz4lt" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.274976 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82113c4e-4aec-4aed-8edf-8918d49c64de-config\") pod \"dnsmasq-dns-66f4bdbdb7-p7zfn\" (UID: \"82113c4e-4aec-4aed-8edf-8918d49c64de\") " pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.275567 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82113c4e-4aec-4aed-8edf-8918d49c64de-ovsdbserver-nb\") pod \"dnsmasq-dns-66f4bdbdb7-p7zfn\" (UID: \"82113c4e-4aec-4aed-8edf-8918d49c64de\") " pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.276172 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82113c4e-4aec-4aed-8edf-8918d49c64de-ovsdbserver-sb\") pod \"dnsmasq-dns-66f4bdbdb7-p7zfn\" (UID: \"82113c4e-4aec-4aed-8edf-8918d49c64de\") " pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.277009 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82113c4e-4aec-4aed-8edf-8918d49c64de-dns-svc\") pod \"dnsmasq-dns-66f4bdbdb7-p7zfn\" (UID: \"82113c4e-4aec-4aed-8edf-8918d49c64de\") " pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.295167 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6gjz\" (UniqueName: \"kubernetes.io/projected/82113c4e-4aec-4aed-8edf-8918d49c64de-kube-api-access-j6gjz\") pod \"dnsmasq-dns-66f4bdbdb7-p7zfn\" (UID: \"82113c4e-4aec-4aed-8edf-8918d49c64de\") " pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" Nov 25 09:19:00 crc 
kubenswrapper[4565]: I1125 09:19:00.303636 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9gszq" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.376654 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/754ce7b0-5a2a-4206-a901-94010cec0e08-config-data\") pod \"placement-db-sync-wz4lt\" (UID: \"754ce7b0-5a2a-4206-a901-94010cec0e08\") " pod="openstack/placement-db-sync-wz4lt" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.376836 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/754ce7b0-5a2a-4206-a901-94010cec0e08-scripts\") pod \"placement-db-sync-wz4lt\" (UID: \"754ce7b0-5a2a-4206-a901-94010cec0e08\") " pod="openstack/placement-db-sync-wz4lt" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.376894 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/754ce7b0-5a2a-4206-a901-94010cec0e08-combined-ca-bundle\") pod \"placement-db-sync-wz4lt\" (UID: \"754ce7b0-5a2a-4206-a901-94010cec0e08\") " pod="openstack/placement-db-sync-wz4lt" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.376912 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8mh2\" (UniqueName: \"kubernetes.io/projected/754ce7b0-5a2a-4206-a901-94010cec0e08-kube-api-access-t8mh2\") pod \"placement-db-sync-wz4lt\" (UID: \"754ce7b0-5a2a-4206-a901-94010cec0e08\") " pod="openstack/placement-db-sync-wz4lt" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.376961 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fad94c28-1e87-4955-9d65-ca6cfdd4087f-db-sync-config-data\") pod \"barbican-db-sync-bs9qc\" (UID: 
\"fad94c28-1e87-4955-9d65-ca6cfdd4087f\") " pod="openstack/barbican-db-sync-bs9qc" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.376996 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sqqn\" (UniqueName: \"kubernetes.io/projected/fad94c28-1e87-4955-9d65-ca6cfdd4087f-kube-api-access-2sqqn\") pod \"barbican-db-sync-bs9qc\" (UID: \"fad94c28-1e87-4955-9d65-ca6cfdd4087f\") " pod="openstack/barbican-db-sync-bs9qc" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.377047 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/754ce7b0-5a2a-4206-a901-94010cec0e08-logs\") pod \"placement-db-sync-wz4lt\" (UID: \"754ce7b0-5a2a-4206-a901-94010cec0e08\") " pod="openstack/placement-db-sync-wz4lt" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.377073 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fad94c28-1e87-4955-9d65-ca6cfdd4087f-combined-ca-bundle\") pod \"barbican-db-sync-bs9qc\" (UID: \"fad94c28-1e87-4955-9d65-ca6cfdd4087f\") " pod="openstack/barbican-db-sync-bs9qc" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.381616 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/754ce7b0-5a2a-4206-a901-94010cec0e08-logs\") pod \"placement-db-sync-wz4lt\" (UID: \"754ce7b0-5a2a-4206-a901-94010cec0e08\") " pod="openstack/placement-db-sync-wz4lt" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.385269 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/754ce7b0-5a2a-4206-a901-94010cec0e08-combined-ca-bundle\") pod \"placement-db-sync-wz4lt\" (UID: \"754ce7b0-5a2a-4206-a901-94010cec0e08\") " pod="openstack/placement-db-sync-wz4lt" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 
09:19:00.385308 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/754ce7b0-5a2a-4206-a901-94010cec0e08-config-data\") pod \"placement-db-sync-wz4lt\" (UID: \"754ce7b0-5a2a-4206-a901-94010cec0e08\") " pod="openstack/placement-db-sync-wz4lt" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.385721 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fad94c28-1e87-4955-9d65-ca6cfdd4087f-combined-ca-bundle\") pod \"barbican-db-sync-bs9qc\" (UID: \"fad94c28-1e87-4955-9d65-ca6cfdd4087f\") " pod="openstack/barbican-db-sync-bs9qc" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.386569 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fad94c28-1e87-4955-9d65-ca6cfdd4087f-db-sync-config-data\") pod \"barbican-db-sync-bs9qc\" (UID: \"fad94c28-1e87-4955-9d65-ca6cfdd4087f\") " pod="openstack/barbican-db-sync-bs9qc" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.393311 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/754ce7b0-5a2a-4206-a901-94010cec0e08-scripts\") pod \"placement-db-sync-wz4lt\" (UID: \"754ce7b0-5a2a-4206-a901-94010cec0e08\") " pod="openstack/placement-db-sync-wz4lt" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.405183 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8mh2\" (UniqueName: \"kubernetes.io/projected/754ce7b0-5a2a-4206-a901-94010cec0e08-kube-api-access-t8mh2\") pod \"placement-db-sync-wz4lt\" (UID: \"754ce7b0-5a2a-4206-a901-94010cec0e08\") " pod="openstack/placement-db-sync-wz4lt" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.408772 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sqqn\" (UniqueName: 
\"kubernetes.io/projected/fad94c28-1e87-4955-9d65-ca6cfdd4087f-kube-api-access-2sqqn\") pod \"barbican-db-sync-bs9qc\" (UID: \"fad94c28-1e87-4955-9d65-ca6cfdd4087f\") " pod="openstack/barbican-db-sync-bs9qc" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.504106 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.519276 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-bs9qc" Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.531981 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nv87t"] Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.549551 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-wz4lt" Nov 25 09:19:00 crc kubenswrapper[4565]: W1125 09:19:00.576450 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c331635_d5f5_4abd_8765_6448ebcc79a3.slice/crio-be21933d8250ee91297086c2bff8373854637ce2eacee0bc2b24f60dd4783f6b WatchSource:0}: Error finding container be21933d8250ee91297086c2bff8373854637ce2eacee0bc2b24f60dd4783f6b: Status 404 returned error can't find the container with id be21933d8250ee91297086c2bff8373854637ce2eacee0bc2b24f60dd4783f6b Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.613375 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b76c757b7-z9pfj"] Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.724955 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.854661 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-66vnw"] Nov 25 09:19:00 crc kubenswrapper[4565]: I1125 09:19:00.970278 4565 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9gszq"] Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.040548 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66f4bdbdb7-p7zfn"] Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.118364 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e61ca2e3-33ba-4887-9753-144f603688b9" path="/var/lib/kubelet/pods/e61ca2e3-33ba-4887-9753-144f603688b9/volumes" Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.145685 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-bs9qc"] Nov 25 09:19:01 crc kubenswrapper[4565]: W1125 09:19:01.163213 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfad94c28_1e87_4955_9d65_ca6cfdd4087f.slice/crio-2574258413dbde769160c74d33162c844803ce5ce095998dc3012e830d92faf4 WatchSource:0}: Error finding container 2574258413dbde769160c74d33162c844803ce5ce095998dc3012e830d92faf4: Status 404 returned error can't find the container with id 2574258413dbde769160c74d33162c844803ce5ce095998dc3012e830d92faf4 Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.267563 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-wz4lt"] Nov 25 09:19:01 crc kubenswrapper[4565]: W1125 09:19:01.276067 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod754ce7b0_5a2a_4206_a901_94010cec0e08.slice/crio-a9f93957e635e46ecc8a1815b0eeafcd01223e5524c1672ec218f33c2ec45927 WatchSource:0}: Error finding container a9f93957e635e46ecc8a1815b0eeafcd01223e5524c1672ec218f33c2ec45927: Status 404 returned error can't find the container with id a9f93957e635e46ecc8a1815b0eeafcd01223e5524c1672ec218f33c2ec45927 Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.343132 4565 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/barbican-db-sync-bs9qc" event={"ID":"fad94c28-1e87-4955-9d65-ca6cfdd4087f","Type":"ContainerStarted","Data":"2574258413dbde769160c74d33162c844803ce5ce095998dc3012e830d92faf4"} Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.344165 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wz4lt" event={"ID":"754ce7b0-5a2a-4206-a901-94010cec0e08","Type":"ContainerStarted","Data":"a9f93957e635e46ecc8a1815b0eeafcd01223e5524c1672ec218f33c2ec45927"} Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.345323 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9gszq" event={"ID":"33e21a69-41ac-4166-bfeb-6ec0eaff7e64","Type":"ContainerStarted","Data":"ca9c80a53639c414326f75639004cd0157fdce9c1a2fd60c7f76f7909114dd1c"} Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.346646 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-66vnw" event={"ID":"7b855517-9e82-425c-ab21-321b63053540","Type":"ContainerStarted","Data":"33440ebbdd12140fa0cd8cca99e21d5931715b4988161be46d272172afd80dfd"} Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.346713 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-66vnw" event={"ID":"7b855517-9e82-425c-ab21-321b63053540","Type":"ContainerStarted","Data":"dec9b8c8a1d231a2378082a79dc5c95ec9cf01bdcc460ee2aa1c189cd8167c19"} Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.347918 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc2a2ff-50de-4bb9-9581-3868db5ec59e","Type":"ContainerStarted","Data":"8a86dabb0127021e49202ea997988b534802f23f7473261464a13dec5c45ff68"} Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.348890 4565 generic.go:334] "Generic (PLEG): container finished" podID="5280e045-f6ed-4ddf-ba48-3d67d7d63c9a" containerID="3e9122571e41f48711765f566d82e2ac004a086b31bb7dcef3837bb022eb2b27" 
exitCode=0 Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.348953 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b76c757b7-z9pfj" event={"ID":"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a","Type":"ContainerDied","Data":"3e9122571e41f48711765f566d82e2ac004a086b31bb7dcef3837bb022eb2b27"} Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.348989 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b76c757b7-z9pfj" event={"ID":"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a","Type":"ContainerStarted","Data":"7dd765f911683d3ac4f64eef955a1ad99bc20177b708b403d852e54dddc5e499"} Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.353658 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nv87t" event={"ID":"5c331635-d5f5-4abd-8765-6448ebcc79a3","Type":"ContainerStarted","Data":"7dc8e423018162e4767da1db52a8154d522ede88c87127b6cbc39e39f3b8acbf"} Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.353699 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nv87t" event={"ID":"5c331635-d5f5-4abd-8765-6448ebcc79a3","Type":"ContainerStarted","Data":"be21933d8250ee91297086c2bff8373854637ce2eacee0bc2b24f60dd4783f6b"} Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.355368 4565 generic.go:334] "Generic (PLEG): container finished" podID="82113c4e-4aec-4aed-8edf-8918d49c64de" containerID="7ba29dbbe591c43f090eb6e1e2fb9f59fc83f7e19bbdb0720daaa60758cbe31d" exitCode=0 Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.355423 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" event={"ID":"82113c4e-4aec-4aed-8edf-8918d49c64de","Type":"ContainerDied","Data":"7ba29dbbe591c43f090eb6e1e2fb9f59fc83f7e19bbdb0720daaa60758cbe31d"} Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.355449 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" 
event={"ID":"82113c4e-4aec-4aed-8edf-8918d49c64de","Type":"ContainerStarted","Data":"a958588604917f26db6b4079bfd97435a61ce1fe3153859bcd32fb537fd844f6"} Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.379605 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-66vnw" podStartSLOduration=2.379585803 podStartE2EDuration="2.379585803s" podCreationTimestamp="2025-11-25 09:18:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:19:01.362464315 +0000 UTC m=+874.564959453" watchObservedRunningTime="2025-11-25 09:19:01.379585803 +0000 UTC m=+874.582080940" Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.448885 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-nv87t" podStartSLOduration=2.448864585 podStartE2EDuration="2.448864585s" podCreationTimestamp="2025-11-25 09:18:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:19:01.447554456 +0000 UTC m=+874.650049584" watchObservedRunningTime="2025-11-25 09:19:01.448864585 +0000 UTC m=+874.651359723" Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.731957 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b76c757b7-z9pfj" Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.829446 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-ovsdbserver-nb\") pod \"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\" (UID: \"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\") " Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.829510 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-config\") pod \"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\" (UID: \"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\") " Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.829636 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrnvt\" (UniqueName: \"kubernetes.io/projected/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-kube-api-access-jrnvt\") pod \"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\" (UID: \"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\") " Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.829660 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-dns-svc\") pod \"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\" (UID: \"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\") " Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.829688 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-ovsdbserver-sb\") pod \"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\" (UID: \"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a\") " Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.854046 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-kube-api-access-jrnvt" (OuterVolumeSpecName: "kube-api-access-jrnvt") pod "5280e045-f6ed-4ddf-ba48-3d67d7d63c9a" (UID: "5280e045-f6ed-4ddf-ba48-3d67d7d63c9a"). InnerVolumeSpecName "kube-api-access-jrnvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.860904 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5280e045-f6ed-4ddf-ba48-3d67d7d63c9a" (UID: "5280e045-f6ed-4ddf-ba48-3d67d7d63c9a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.864352 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5280e045-f6ed-4ddf-ba48-3d67d7d63c9a" (UID: "5280e045-f6ed-4ddf-ba48-3d67d7d63c9a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.868366 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-config" (OuterVolumeSpecName: "config") pod "5280e045-f6ed-4ddf-ba48-3d67d7d63c9a" (UID: "5280e045-f6ed-4ddf-ba48-3d67d7d63c9a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.874718 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5280e045-f6ed-4ddf-ba48-3d67d7d63c9a" (UID: "5280e045-f6ed-4ddf-ba48-3d67d7d63c9a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.933944 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrnvt\" (UniqueName: \"kubernetes.io/projected/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-kube-api-access-jrnvt\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.933983 4565 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.933994 4565 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.934003 4565 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:01 crc kubenswrapper[4565]: I1125 09:19:01.934016 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:02 crc kubenswrapper[4565]: I1125 09:19:02.070399 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:19:02 crc kubenswrapper[4565]: I1125 09:19:02.378754 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b76c757b7-z9pfj" Nov 25 09:19:02 crc kubenswrapper[4565]: I1125 09:19:02.381473 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b76c757b7-z9pfj" event={"ID":"5280e045-f6ed-4ddf-ba48-3d67d7d63c9a","Type":"ContainerDied","Data":"7dd765f911683d3ac4f64eef955a1ad99bc20177b708b403d852e54dddc5e499"} Nov 25 09:19:02 crc kubenswrapper[4565]: I1125 09:19:02.381538 4565 scope.go:117] "RemoveContainer" containerID="3e9122571e41f48711765f566d82e2ac004a086b31bb7dcef3837bb022eb2b27" Nov 25 09:19:02 crc kubenswrapper[4565]: I1125 09:19:02.405094 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" event={"ID":"82113c4e-4aec-4aed-8edf-8918d49c64de","Type":"ContainerStarted","Data":"c6291dff42c38991e0c84554070fc41a1701223885d0eb0a69034b4b89e7f0fe"} Nov 25 09:19:02 crc kubenswrapper[4565]: I1125 09:19:02.405529 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" Nov 25 09:19:02 crc kubenswrapper[4565]: I1125 09:19:02.449397 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" podStartSLOduration=3.4493698569999998 podStartE2EDuration="3.449369857s" podCreationTimestamp="2025-11-25 09:18:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:19:02.437598657 +0000 UTC m=+875.640093784" watchObservedRunningTime="2025-11-25 09:19:02.449369857 +0000 UTC m=+875.651864995" Nov 25 09:19:02 crc kubenswrapper[4565]: I1125 09:19:02.497264 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b76c757b7-z9pfj"] Nov 25 09:19:02 crc kubenswrapper[4565]: I1125 09:19:02.552434 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b76c757b7-z9pfj"] Nov 25 09:19:03 crc kubenswrapper[4565]: I1125 
09:19:03.109068 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5280e045-f6ed-4ddf-ba48-3d67d7d63c9a" path="/var/lib/kubelet/pods/5280e045-f6ed-4ddf-ba48-3d67d7d63c9a/volumes" Nov 25 09:19:04 crc kubenswrapper[4565]: I1125 09:19:04.435764 4565 generic.go:334] "Generic (PLEG): container finished" podID="5c331635-d5f5-4abd-8765-6448ebcc79a3" containerID="7dc8e423018162e4767da1db52a8154d522ede88c87127b6cbc39e39f3b8acbf" exitCode=0 Nov 25 09:19:04 crc kubenswrapper[4565]: I1125 09:19:04.435831 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nv87t" event={"ID":"5c331635-d5f5-4abd-8765-6448ebcc79a3","Type":"ContainerDied","Data":"7dc8e423018162e4767da1db52a8154d522ede88c87127b6cbc39e39f3b8acbf"} Nov 25 09:19:06 crc kubenswrapper[4565]: I1125 09:19:06.760238 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nv87t" Nov 25 09:19:06 crc kubenswrapper[4565]: I1125 09:19:06.845424 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptvll\" (UniqueName: \"kubernetes.io/projected/5c331635-d5f5-4abd-8765-6448ebcc79a3-kube-api-access-ptvll\") pod \"5c331635-d5f5-4abd-8765-6448ebcc79a3\" (UID: \"5c331635-d5f5-4abd-8765-6448ebcc79a3\") " Nov 25 09:19:06 crc kubenswrapper[4565]: I1125 09:19:06.845665 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-fernet-keys\") pod \"5c331635-d5f5-4abd-8765-6448ebcc79a3\" (UID: \"5c331635-d5f5-4abd-8765-6448ebcc79a3\") " Nov 25 09:19:06 crc kubenswrapper[4565]: I1125 09:19:06.845878 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-scripts\") pod \"5c331635-d5f5-4abd-8765-6448ebcc79a3\" (UID: 
\"5c331635-d5f5-4abd-8765-6448ebcc79a3\") " Nov 25 09:19:06 crc kubenswrapper[4565]: I1125 09:19:06.846134 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-config-data\") pod \"5c331635-d5f5-4abd-8765-6448ebcc79a3\" (UID: \"5c331635-d5f5-4abd-8765-6448ebcc79a3\") " Nov 25 09:19:06 crc kubenswrapper[4565]: I1125 09:19:06.846360 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-credential-keys\") pod \"5c331635-d5f5-4abd-8765-6448ebcc79a3\" (UID: \"5c331635-d5f5-4abd-8765-6448ebcc79a3\") " Nov 25 09:19:06 crc kubenswrapper[4565]: I1125 09:19:06.846451 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-combined-ca-bundle\") pod \"5c331635-d5f5-4abd-8765-6448ebcc79a3\" (UID: \"5c331635-d5f5-4abd-8765-6448ebcc79a3\") " Nov 25 09:19:06 crc kubenswrapper[4565]: I1125 09:19:06.854228 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-scripts" (OuterVolumeSpecName: "scripts") pod "5c331635-d5f5-4abd-8765-6448ebcc79a3" (UID: "5c331635-d5f5-4abd-8765-6448ebcc79a3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:06 crc kubenswrapper[4565]: I1125 09:19:06.854355 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5c331635-d5f5-4abd-8765-6448ebcc79a3" (UID: "5c331635-d5f5-4abd-8765-6448ebcc79a3"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:06 crc kubenswrapper[4565]: I1125 09:19:06.855747 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5c331635-d5f5-4abd-8765-6448ebcc79a3" (UID: "5c331635-d5f5-4abd-8765-6448ebcc79a3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:06 crc kubenswrapper[4565]: I1125 09:19:06.874618 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c331635-d5f5-4abd-8765-6448ebcc79a3-kube-api-access-ptvll" (OuterVolumeSpecName: "kube-api-access-ptvll") pod "5c331635-d5f5-4abd-8765-6448ebcc79a3" (UID: "5c331635-d5f5-4abd-8765-6448ebcc79a3"). InnerVolumeSpecName "kube-api-access-ptvll". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:19:06 crc kubenswrapper[4565]: I1125 09:19:06.875891 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-config-data" (OuterVolumeSpecName: "config-data") pod "5c331635-d5f5-4abd-8765-6448ebcc79a3" (UID: "5c331635-d5f5-4abd-8765-6448ebcc79a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:06 crc kubenswrapper[4565]: I1125 09:19:06.886678 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c331635-d5f5-4abd-8765-6448ebcc79a3" (UID: "5c331635-d5f5-4abd-8765-6448ebcc79a3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:06 crc kubenswrapper[4565]: I1125 09:19:06.949858 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:06 crc kubenswrapper[4565]: I1125 09:19:06.950177 4565 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:06 crc kubenswrapper[4565]: I1125 09:19:06.950191 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:06 crc kubenswrapper[4565]: I1125 09:19:06.950204 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptvll\" (UniqueName: \"kubernetes.io/projected/5c331635-d5f5-4abd-8765-6448ebcc79a3-kube-api-access-ptvll\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:06 crc kubenswrapper[4565]: I1125 09:19:06.950215 4565 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:06 crc kubenswrapper[4565]: I1125 09:19:06.950224 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c331635-d5f5-4abd-8765-6448ebcc79a3-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:07 crc kubenswrapper[4565]: I1125 09:19:07.474325 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nv87t" event={"ID":"5c331635-d5f5-4abd-8765-6448ebcc79a3","Type":"ContainerDied","Data":"be21933d8250ee91297086c2bff8373854637ce2eacee0bc2b24f60dd4783f6b"} Nov 25 09:19:07 crc kubenswrapper[4565]: I1125 
09:19:07.474390 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be21933d8250ee91297086c2bff8373854637ce2eacee0bc2b24f60dd4783f6b" Nov 25 09:19:07 crc kubenswrapper[4565]: I1125 09:19:07.474391 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nv87t" Nov 25 09:19:07 crc kubenswrapper[4565]: I1125 09:19:07.848995 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-nv87t"] Nov 25 09:19:07 crc kubenswrapper[4565]: I1125 09:19:07.860206 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-nv87t"] Nov 25 09:19:07 crc kubenswrapper[4565]: I1125 09:19:07.973439 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-4k5dr"] Nov 25 09:19:07 crc kubenswrapper[4565]: E1125 09:19:07.974276 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5280e045-f6ed-4ddf-ba48-3d67d7d63c9a" containerName="init" Nov 25 09:19:07 crc kubenswrapper[4565]: I1125 09:19:07.974303 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="5280e045-f6ed-4ddf-ba48-3d67d7d63c9a" containerName="init" Nov 25 09:19:07 crc kubenswrapper[4565]: E1125 09:19:07.974340 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c331635-d5f5-4abd-8765-6448ebcc79a3" containerName="keystone-bootstrap" Nov 25 09:19:07 crc kubenswrapper[4565]: I1125 09:19:07.974349 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c331635-d5f5-4abd-8765-6448ebcc79a3" containerName="keystone-bootstrap" Nov 25 09:19:07 crc kubenswrapper[4565]: I1125 09:19:07.974806 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c331635-d5f5-4abd-8765-6448ebcc79a3" containerName="keystone-bootstrap" Nov 25 09:19:07 crc kubenswrapper[4565]: I1125 09:19:07.974842 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="5280e045-f6ed-4ddf-ba48-3d67d7d63c9a" 
containerName="init" Nov 25 09:19:07 crc kubenswrapper[4565]: I1125 09:19:07.975764 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4k5dr"] Nov 25 09:19:07 crc kubenswrapper[4565]: I1125 09:19:07.975884 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4k5dr" Nov 25 09:19:07 crc kubenswrapper[4565]: I1125 09:19:07.983222 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 25 09:19:07 crc kubenswrapper[4565]: I1125 09:19:07.983385 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tx2k9" Nov 25 09:19:07 crc kubenswrapper[4565]: I1125 09:19:07.983634 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 25 09:19:07 crc kubenswrapper[4565]: I1125 09:19:07.983815 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 25 09:19:07 crc kubenswrapper[4565]: I1125 09:19:07.983839 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 25 09:19:08 crc kubenswrapper[4565]: I1125 09:19:08.089303 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-scripts\") pod \"keystone-bootstrap-4k5dr\" (UID: \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\") " pod="openstack/keystone-bootstrap-4k5dr" Nov 25 09:19:08 crc kubenswrapper[4565]: I1125 09:19:08.089562 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-credential-keys\") pod \"keystone-bootstrap-4k5dr\" (UID: \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\") " pod="openstack/keystone-bootstrap-4k5dr" Nov 25 09:19:08 crc kubenswrapper[4565]: 
I1125 09:19:08.089587 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-fernet-keys\") pod \"keystone-bootstrap-4k5dr\" (UID: \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\") " pod="openstack/keystone-bootstrap-4k5dr" Nov 25 09:19:08 crc kubenswrapper[4565]: I1125 09:19:08.089686 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hht65\" (UniqueName: \"kubernetes.io/projected/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-kube-api-access-hht65\") pod \"keystone-bootstrap-4k5dr\" (UID: \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\") " pod="openstack/keystone-bootstrap-4k5dr" Nov 25 09:19:08 crc kubenswrapper[4565]: I1125 09:19:08.090062 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-combined-ca-bundle\") pod \"keystone-bootstrap-4k5dr\" (UID: \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\") " pod="openstack/keystone-bootstrap-4k5dr" Nov 25 09:19:08 crc kubenswrapper[4565]: I1125 09:19:08.090105 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-config-data\") pod \"keystone-bootstrap-4k5dr\" (UID: \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\") " pod="openstack/keystone-bootstrap-4k5dr" Nov 25 09:19:08 crc kubenswrapper[4565]: I1125 09:19:08.191582 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-scripts\") pod \"keystone-bootstrap-4k5dr\" (UID: \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\") " pod="openstack/keystone-bootstrap-4k5dr" Nov 25 09:19:08 crc kubenswrapper[4565]: I1125 09:19:08.191671 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-fernet-keys\") pod \"keystone-bootstrap-4k5dr\" (UID: \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\") " pod="openstack/keystone-bootstrap-4k5dr" Nov 25 09:19:08 crc kubenswrapper[4565]: I1125 09:19:08.191695 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-credential-keys\") pod \"keystone-bootstrap-4k5dr\" (UID: \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\") " pod="openstack/keystone-bootstrap-4k5dr" Nov 25 09:19:08 crc kubenswrapper[4565]: I1125 09:19:08.191729 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hht65\" (UniqueName: \"kubernetes.io/projected/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-kube-api-access-hht65\") pod \"keystone-bootstrap-4k5dr\" (UID: \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\") " pod="openstack/keystone-bootstrap-4k5dr" Nov 25 09:19:08 crc kubenswrapper[4565]: I1125 09:19:08.191810 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-combined-ca-bundle\") pod \"keystone-bootstrap-4k5dr\" (UID: \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\") " pod="openstack/keystone-bootstrap-4k5dr" Nov 25 09:19:08 crc kubenswrapper[4565]: I1125 09:19:08.191835 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-config-data\") pod \"keystone-bootstrap-4k5dr\" (UID: \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\") " pod="openstack/keystone-bootstrap-4k5dr" Nov 25 09:19:08 crc kubenswrapper[4565]: I1125 09:19:08.196896 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-fernet-keys\") pod \"keystone-bootstrap-4k5dr\" (UID: \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\") " pod="openstack/keystone-bootstrap-4k5dr" Nov 25 09:19:08 crc kubenswrapper[4565]: I1125 09:19:08.196947 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-credential-keys\") pod \"keystone-bootstrap-4k5dr\" (UID: \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\") " pod="openstack/keystone-bootstrap-4k5dr" Nov 25 09:19:08 crc kubenswrapper[4565]: I1125 09:19:08.197405 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-scripts\") pod \"keystone-bootstrap-4k5dr\" (UID: \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\") " pod="openstack/keystone-bootstrap-4k5dr" Nov 25 09:19:08 crc kubenswrapper[4565]: I1125 09:19:08.198139 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-combined-ca-bundle\") pod \"keystone-bootstrap-4k5dr\" (UID: \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\") " pod="openstack/keystone-bootstrap-4k5dr" Nov 25 09:19:08 crc kubenswrapper[4565]: I1125 09:19:08.198601 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-config-data\") pod \"keystone-bootstrap-4k5dr\" (UID: \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\") " pod="openstack/keystone-bootstrap-4k5dr" Nov 25 09:19:08 crc kubenswrapper[4565]: I1125 09:19:08.207809 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hht65\" (UniqueName: \"kubernetes.io/projected/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-kube-api-access-hht65\") pod \"keystone-bootstrap-4k5dr\" (UID: 
\"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\") " pod="openstack/keystone-bootstrap-4k5dr" Nov 25 09:19:08 crc kubenswrapper[4565]: I1125 09:19:08.304465 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4k5dr" Nov 25 09:19:09 crc kubenswrapper[4565]: I1125 09:19:09.106744 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c331635-d5f5-4abd-8765-6448ebcc79a3" path="/var/lib/kubelet/pods/5c331635-d5f5-4abd-8765-6448ebcc79a3/volumes" Nov 25 09:19:09 crc kubenswrapper[4565]: I1125 09:19:09.494908 4565 generic.go:334] "Generic (PLEG): container finished" podID="7b855517-9e82-425c-ab21-321b63053540" containerID="33440ebbdd12140fa0cd8cca99e21d5931715b4988161be46d272172afd80dfd" exitCode=0 Nov 25 09:19:09 crc kubenswrapper[4565]: I1125 09:19:09.494983 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-66vnw" event={"ID":"7b855517-9e82-425c-ab21-321b63053540","Type":"ContainerDied","Data":"33440ebbdd12140fa0cd8cca99e21d5931715b4988161be46d272172afd80dfd"} Nov 25 09:19:10 crc kubenswrapper[4565]: I1125 09:19:10.506229 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" Nov 25 09:19:10 crc kubenswrapper[4565]: I1125 09:19:10.565041 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-569d458467-6r4gl"] Nov 25 09:19:10 crc kubenswrapper[4565]: I1125 09:19:10.565712 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-569d458467-6r4gl" podUID="66545593-8b73-4d2e-9f66-4eb259ef6752" containerName="dnsmasq-dns" containerID="cri-o://88a2e475737f9205d7eddd768e245c3341c0cfe05c417ab65bb93947aa664bcb" gracePeriod=10 Nov 25 09:19:11 crc kubenswrapper[4565]: I1125 09:19:11.517373 4565 generic.go:334] "Generic (PLEG): container finished" podID="66545593-8b73-4d2e-9f66-4eb259ef6752" 
containerID="88a2e475737f9205d7eddd768e245c3341c0cfe05c417ab65bb93947aa664bcb" exitCode=0 Nov 25 09:19:11 crc kubenswrapper[4565]: I1125 09:19:11.517416 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-569d458467-6r4gl" event={"ID":"66545593-8b73-4d2e-9f66-4eb259ef6752","Type":"ContainerDied","Data":"88a2e475737f9205d7eddd768e245c3341c0cfe05c417ab65bb93947aa664bcb"} Nov 25 09:19:13 crc kubenswrapper[4565]: I1125 09:19:13.045472 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-569d458467-6r4gl" podUID="66545593-8b73-4d2e-9f66-4eb259ef6752" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: connect: connection refused" Nov 25 09:19:13 crc kubenswrapper[4565]: I1125 09:19:13.286016 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-66vnw" Nov 25 09:19:13 crc kubenswrapper[4565]: I1125 09:19:13.395785 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b855517-9e82-425c-ab21-321b63053540-config\") pod \"7b855517-9e82-425c-ab21-321b63053540\" (UID: \"7b855517-9e82-425c-ab21-321b63053540\") " Nov 25 09:19:13 crc kubenswrapper[4565]: I1125 09:19:13.396119 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b855517-9e82-425c-ab21-321b63053540-combined-ca-bundle\") pod \"7b855517-9e82-425c-ab21-321b63053540\" (UID: \"7b855517-9e82-425c-ab21-321b63053540\") " Nov 25 09:19:13 crc kubenswrapper[4565]: I1125 09:19:13.396191 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cb2n\" (UniqueName: \"kubernetes.io/projected/7b855517-9e82-425c-ab21-321b63053540-kube-api-access-7cb2n\") pod \"7b855517-9e82-425c-ab21-321b63053540\" (UID: \"7b855517-9e82-425c-ab21-321b63053540\") " Nov 25 09:19:13 crc 
kubenswrapper[4565]: I1125 09:19:13.400772 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b855517-9e82-425c-ab21-321b63053540-kube-api-access-7cb2n" (OuterVolumeSpecName: "kube-api-access-7cb2n") pod "7b855517-9e82-425c-ab21-321b63053540" (UID: "7b855517-9e82-425c-ab21-321b63053540"). InnerVolumeSpecName "kube-api-access-7cb2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:19:13 crc kubenswrapper[4565]: I1125 09:19:13.422785 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b855517-9e82-425c-ab21-321b63053540-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b855517-9e82-425c-ab21-321b63053540" (UID: "7b855517-9e82-425c-ab21-321b63053540"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:13 crc kubenswrapper[4565]: I1125 09:19:13.430881 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b855517-9e82-425c-ab21-321b63053540-config" (OuterVolumeSpecName: "config") pod "7b855517-9e82-425c-ab21-321b63053540" (UID: "7b855517-9e82-425c-ab21-321b63053540"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:13 crc kubenswrapper[4565]: I1125 09:19:13.500003 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b855517-9e82-425c-ab21-321b63053540-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:13 crc kubenswrapper[4565]: I1125 09:19:13.500050 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b855517-9e82-425c-ab21-321b63053540-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:13 crc kubenswrapper[4565]: I1125 09:19:13.500064 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cb2n\" (UniqueName: \"kubernetes.io/projected/7b855517-9e82-425c-ab21-321b63053540-kube-api-access-7cb2n\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:13 crc kubenswrapper[4565]: I1125 09:19:13.535159 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-66vnw" event={"ID":"7b855517-9e82-425c-ab21-321b63053540","Type":"ContainerDied","Data":"dec9b8c8a1d231a2378082a79dc5c95ec9cf01bdcc460ee2aa1c189cd8167c19"} Nov 25 09:19:13 crc kubenswrapper[4565]: I1125 09:19:13.535229 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dec9b8c8a1d231a2378082a79dc5c95ec9cf01bdcc460ee2aa1c189cd8167c19" Nov 25 09:19:13 crc kubenswrapper[4565]: I1125 09:19:13.535322 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-66vnw" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.435562 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6677d66f85-8rlv9"] Nov 25 09:19:14 crc kubenswrapper[4565]: E1125 09:19:14.436247 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b855517-9e82-425c-ab21-321b63053540" containerName="neutron-db-sync" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.436274 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b855517-9e82-425c-ab21-321b63053540" containerName="neutron-db-sync" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.436471 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b855517-9e82-425c-ab21-321b63053540" containerName="neutron-db-sync" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.440067 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.462712 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6677d66f85-8rlv9"] Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.516430 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nfbv\" (UniqueName: \"kubernetes.io/projected/f1b008af-c21f-44a0-a919-9a79b7b8e26e-kube-api-access-9nfbv\") pod \"dnsmasq-dns-6677d66f85-8rlv9\" (UID: \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\") " pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.516486 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1b008af-c21f-44a0-a919-9a79b7b8e26e-config\") pod \"dnsmasq-dns-6677d66f85-8rlv9\" (UID: \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\") " pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" Nov 25 
09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.516525 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1b008af-c21f-44a0-a919-9a79b7b8e26e-ovsdbserver-sb\") pod \"dnsmasq-dns-6677d66f85-8rlv9\" (UID: \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\") " pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.516575 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1b008af-c21f-44a0-a919-9a79b7b8e26e-dns-svc\") pod \"dnsmasq-dns-6677d66f85-8rlv9\" (UID: \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\") " pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.516637 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1b008af-c21f-44a0-a919-9a79b7b8e26e-ovsdbserver-nb\") pod \"dnsmasq-dns-6677d66f85-8rlv9\" (UID: \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\") " pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.579347 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b76df5f9b-dzngk"] Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.581544 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b76df5f9b-dzngk" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.584436 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.584641 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.584978 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.585448 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wgk4c" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.594460 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b76df5f9b-dzngk"] Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.618670 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nfbv\" (UniqueName: \"kubernetes.io/projected/f1b008af-c21f-44a0-a919-9a79b7b8e26e-kube-api-access-9nfbv\") pod \"dnsmasq-dns-6677d66f85-8rlv9\" (UID: \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\") " pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.618726 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1b008af-c21f-44a0-a919-9a79b7b8e26e-config\") pod \"dnsmasq-dns-6677d66f85-8rlv9\" (UID: \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\") " pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.618750 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4a7875ff-89fe-4226-904b-622edafc2aac-httpd-config\") pod \"neutron-b76df5f9b-dzngk\" (UID: 
\"4a7875ff-89fe-4226-904b-622edafc2aac\") " pod="openstack/neutron-b76df5f9b-dzngk" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.618779 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1b008af-c21f-44a0-a919-9a79b7b8e26e-ovsdbserver-sb\") pod \"dnsmasq-dns-6677d66f85-8rlv9\" (UID: \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\") " pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.618796 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf28g\" (UniqueName: \"kubernetes.io/projected/4a7875ff-89fe-4226-904b-622edafc2aac-kube-api-access-cf28g\") pod \"neutron-b76df5f9b-dzngk\" (UID: \"4a7875ff-89fe-4226-904b-622edafc2aac\") " pod="openstack/neutron-b76df5f9b-dzngk" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.618832 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1b008af-c21f-44a0-a919-9a79b7b8e26e-dns-svc\") pod \"dnsmasq-dns-6677d66f85-8rlv9\" (UID: \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\") " pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.618856 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7875ff-89fe-4226-904b-622edafc2aac-combined-ca-bundle\") pod \"neutron-b76df5f9b-dzngk\" (UID: \"4a7875ff-89fe-4226-904b-622edafc2aac\") " pod="openstack/neutron-b76df5f9b-dzngk" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.618889 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a7875ff-89fe-4226-904b-622edafc2aac-ovndb-tls-certs\") pod \"neutron-b76df5f9b-dzngk\" (UID: 
\"4a7875ff-89fe-4226-904b-622edafc2aac\") " pod="openstack/neutron-b76df5f9b-dzngk" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.618910 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1b008af-c21f-44a0-a919-9a79b7b8e26e-ovsdbserver-nb\") pod \"dnsmasq-dns-6677d66f85-8rlv9\" (UID: \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\") " pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.618987 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a7875ff-89fe-4226-904b-622edafc2aac-config\") pod \"neutron-b76df5f9b-dzngk\" (UID: \"4a7875ff-89fe-4226-904b-622edafc2aac\") " pod="openstack/neutron-b76df5f9b-dzngk" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.620780 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1b008af-c21f-44a0-a919-9a79b7b8e26e-config\") pod \"dnsmasq-dns-6677d66f85-8rlv9\" (UID: \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\") " pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.621314 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1b008af-c21f-44a0-a919-9a79b7b8e26e-ovsdbserver-sb\") pod \"dnsmasq-dns-6677d66f85-8rlv9\" (UID: \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\") " pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.622498 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1b008af-c21f-44a0-a919-9a79b7b8e26e-dns-svc\") pod \"dnsmasq-dns-6677d66f85-8rlv9\" (UID: \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\") " pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" Nov 25 09:19:14 crc 
kubenswrapper[4565]: I1125 09:19:14.623078 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1b008af-c21f-44a0-a919-9a79b7b8e26e-ovsdbserver-nb\") pod \"dnsmasq-dns-6677d66f85-8rlv9\" (UID: \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\") " pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.640576 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nfbv\" (UniqueName: \"kubernetes.io/projected/f1b008af-c21f-44a0-a919-9a79b7b8e26e-kube-api-access-9nfbv\") pod \"dnsmasq-dns-6677d66f85-8rlv9\" (UID: \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\") " pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.721011 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a7875ff-89fe-4226-904b-622edafc2aac-config\") pod \"neutron-b76df5f9b-dzngk\" (UID: \"4a7875ff-89fe-4226-904b-622edafc2aac\") " pod="openstack/neutron-b76df5f9b-dzngk" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.721135 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4a7875ff-89fe-4226-904b-622edafc2aac-httpd-config\") pod \"neutron-b76df5f9b-dzngk\" (UID: \"4a7875ff-89fe-4226-904b-622edafc2aac\") " pod="openstack/neutron-b76df5f9b-dzngk" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.721172 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf28g\" (UniqueName: \"kubernetes.io/projected/4a7875ff-89fe-4226-904b-622edafc2aac-kube-api-access-cf28g\") pod \"neutron-b76df5f9b-dzngk\" (UID: \"4a7875ff-89fe-4226-904b-622edafc2aac\") " pod="openstack/neutron-b76df5f9b-dzngk" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.721229 4565 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7875ff-89fe-4226-904b-622edafc2aac-combined-ca-bundle\") pod \"neutron-b76df5f9b-dzngk\" (UID: \"4a7875ff-89fe-4226-904b-622edafc2aac\") " pod="openstack/neutron-b76df5f9b-dzngk" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.721273 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a7875ff-89fe-4226-904b-622edafc2aac-ovndb-tls-certs\") pod \"neutron-b76df5f9b-dzngk\" (UID: \"4a7875ff-89fe-4226-904b-622edafc2aac\") " pod="openstack/neutron-b76df5f9b-dzngk" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.725382 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a7875ff-89fe-4226-904b-622edafc2aac-config\") pod \"neutron-b76df5f9b-dzngk\" (UID: \"4a7875ff-89fe-4226-904b-622edafc2aac\") " pod="openstack/neutron-b76df5f9b-dzngk" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.725737 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4a7875ff-89fe-4226-904b-622edafc2aac-httpd-config\") pod \"neutron-b76df5f9b-dzngk\" (UID: \"4a7875ff-89fe-4226-904b-622edafc2aac\") " pod="openstack/neutron-b76df5f9b-dzngk" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.726422 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a7875ff-89fe-4226-904b-622edafc2aac-ovndb-tls-certs\") pod \"neutron-b76df5f9b-dzngk\" (UID: \"4a7875ff-89fe-4226-904b-622edafc2aac\") " pod="openstack/neutron-b76df5f9b-dzngk" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.728471 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4a7875ff-89fe-4226-904b-622edafc2aac-combined-ca-bundle\") pod \"neutron-b76df5f9b-dzngk\" (UID: \"4a7875ff-89fe-4226-904b-622edafc2aac\") " pod="openstack/neutron-b76df5f9b-dzngk" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.744878 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf28g\" (UniqueName: \"kubernetes.io/projected/4a7875ff-89fe-4226-904b-622edafc2aac-kube-api-access-cf28g\") pod \"neutron-b76df5f9b-dzngk\" (UID: \"4a7875ff-89fe-4226-904b-622edafc2aac\") " pod="openstack/neutron-b76df5f9b-dzngk" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.754763 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" Nov 25 09:19:14 crc kubenswrapper[4565]: I1125 09:19:14.914433 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b76df5f9b-dzngk" Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.013785 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6d4d697775-b8wbb"] Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.019544 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.023183 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.024606 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.032599 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d4d697775-b8wbb"] Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.072206 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/aca1b06e-7a63-4a42-873b-427de57cef7f-httpd-config\") pod \"neutron-6d4d697775-b8wbb\" (UID: \"aca1b06e-7a63-4a42-873b-427de57cef7f\") " pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.072307 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aca1b06e-7a63-4a42-873b-427de57cef7f-ovndb-tls-certs\") pod \"neutron-6d4d697775-b8wbb\" (UID: \"aca1b06e-7a63-4a42-873b-427de57cef7f\") " pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.072339 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aca1b06e-7a63-4a42-873b-427de57cef7f-public-tls-certs\") pod \"neutron-6d4d697775-b8wbb\" (UID: \"aca1b06e-7a63-4a42-873b-427de57cef7f\") " pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.072430 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aca1b06e-7a63-4a42-873b-427de57cef7f-combined-ca-bundle\") pod \"neutron-6d4d697775-b8wbb\" (UID: \"aca1b06e-7a63-4a42-873b-427de57cef7f\") " pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.072506 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aca1b06e-7a63-4a42-873b-427de57cef7f-config\") pod \"neutron-6d4d697775-b8wbb\" (UID: \"aca1b06e-7a63-4a42-873b-427de57cef7f\") " pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.072546 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9xfw\" (UniqueName: \"kubernetes.io/projected/aca1b06e-7a63-4a42-873b-427de57cef7f-kube-api-access-d9xfw\") pod \"neutron-6d4d697775-b8wbb\" (UID: \"aca1b06e-7a63-4a42-873b-427de57cef7f\") " pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.072615 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aca1b06e-7a63-4a42-873b-427de57cef7f-internal-tls-certs\") pod \"neutron-6d4d697775-b8wbb\" (UID: \"aca1b06e-7a63-4a42-873b-427de57cef7f\") " pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.174528 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/aca1b06e-7a63-4a42-873b-427de57cef7f-httpd-config\") pod \"neutron-6d4d697775-b8wbb\" (UID: \"aca1b06e-7a63-4a42-873b-427de57cef7f\") " pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.174606 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/aca1b06e-7a63-4a42-873b-427de57cef7f-ovndb-tls-certs\") pod \"neutron-6d4d697775-b8wbb\" (UID: \"aca1b06e-7a63-4a42-873b-427de57cef7f\") " pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.174628 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aca1b06e-7a63-4a42-873b-427de57cef7f-public-tls-certs\") pod \"neutron-6d4d697775-b8wbb\" (UID: \"aca1b06e-7a63-4a42-873b-427de57cef7f\") " pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.174723 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca1b06e-7a63-4a42-873b-427de57cef7f-combined-ca-bundle\") pod \"neutron-6d4d697775-b8wbb\" (UID: \"aca1b06e-7a63-4a42-873b-427de57cef7f\") " pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.174802 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aca1b06e-7a63-4a42-873b-427de57cef7f-config\") pod \"neutron-6d4d697775-b8wbb\" (UID: \"aca1b06e-7a63-4a42-873b-427de57cef7f\") " pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.174833 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9xfw\" (UniqueName: \"kubernetes.io/projected/aca1b06e-7a63-4a42-873b-427de57cef7f-kube-api-access-d9xfw\") pod \"neutron-6d4d697775-b8wbb\" (UID: \"aca1b06e-7a63-4a42-873b-427de57cef7f\") " pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.174919 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aca1b06e-7a63-4a42-873b-427de57cef7f-internal-tls-certs\") pod 
\"neutron-6d4d697775-b8wbb\" (UID: \"aca1b06e-7a63-4a42-873b-427de57cef7f\") " pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.181721 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aca1b06e-7a63-4a42-873b-427de57cef7f-internal-tls-certs\") pod \"neutron-6d4d697775-b8wbb\" (UID: \"aca1b06e-7a63-4a42-873b-427de57cef7f\") " pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.181803 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aca1b06e-7a63-4a42-873b-427de57cef7f-ovndb-tls-certs\") pod \"neutron-6d4d697775-b8wbb\" (UID: \"aca1b06e-7a63-4a42-873b-427de57cef7f\") " pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.183503 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/aca1b06e-7a63-4a42-873b-427de57cef7f-httpd-config\") pod \"neutron-6d4d697775-b8wbb\" (UID: \"aca1b06e-7a63-4a42-873b-427de57cef7f\") " pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.184501 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/aca1b06e-7a63-4a42-873b-427de57cef7f-config\") pod \"neutron-6d4d697775-b8wbb\" (UID: \"aca1b06e-7a63-4a42-873b-427de57cef7f\") " pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.190370 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aca1b06e-7a63-4a42-873b-427de57cef7f-public-tls-certs\") pod \"neutron-6d4d697775-b8wbb\" (UID: \"aca1b06e-7a63-4a42-873b-427de57cef7f\") " pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:17 crc 
kubenswrapper[4565]: I1125 09:19:17.198989 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca1b06e-7a63-4a42-873b-427de57cef7f-combined-ca-bundle\") pod \"neutron-6d4d697775-b8wbb\" (UID: \"aca1b06e-7a63-4a42-873b-427de57cef7f\") " pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.201869 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9xfw\" (UniqueName: \"kubernetes.io/projected/aca1b06e-7a63-4a42-873b-427de57cef7f-kube-api-access-d9xfw\") pod \"neutron-6d4d697775-b8wbb\" (UID: \"aca1b06e-7a63-4a42-873b-427de57cef7f\") " pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:17 crc kubenswrapper[4565]: I1125 09:19:17.343426 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:21 crc kubenswrapper[4565]: I1125 09:19:21.082905 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-569d458467-6r4gl" Nov 25 09:19:21 crc kubenswrapper[4565]: I1125 09:19:21.143994 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66545593-8b73-4d2e-9f66-4eb259ef6752-ovsdbserver-sb\") pod \"66545593-8b73-4d2e-9f66-4eb259ef6752\" (UID: \"66545593-8b73-4d2e-9f66-4eb259ef6752\") " Nov 25 09:19:21 crc kubenswrapper[4565]: I1125 09:19:21.144071 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66545593-8b73-4d2e-9f66-4eb259ef6752-ovsdbserver-nb\") pod \"66545593-8b73-4d2e-9f66-4eb259ef6752\" (UID: \"66545593-8b73-4d2e-9f66-4eb259ef6752\") " Nov 25 09:19:21 crc kubenswrapper[4565]: I1125 09:19:21.144166 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66545593-8b73-4d2e-9f66-4eb259ef6752-dns-svc\") pod \"66545593-8b73-4d2e-9f66-4eb259ef6752\" (UID: \"66545593-8b73-4d2e-9f66-4eb259ef6752\") " Nov 25 09:19:21 crc kubenswrapper[4565]: I1125 09:19:21.145240 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66545593-8b73-4d2e-9f66-4eb259ef6752-config\") pod \"66545593-8b73-4d2e-9f66-4eb259ef6752\" (UID: \"66545593-8b73-4d2e-9f66-4eb259ef6752\") " Nov 25 09:19:21 crc kubenswrapper[4565]: I1125 09:19:21.145436 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttwlj\" (UniqueName: \"kubernetes.io/projected/66545593-8b73-4d2e-9f66-4eb259ef6752-kube-api-access-ttwlj\") pod \"66545593-8b73-4d2e-9f66-4eb259ef6752\" (UID: \"66545593-8b73-4d2e-9f66-4eb259ef6752\") " Nov 25 09:19:21 crc kubenswrapper[4565]: I1125 09:19:21.149753 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/66545593-8b73-4d2e-9f66-4eb259ef6752-kube-api-access-ttwlj" (OuterVolumeSpecName: "kube-api-access-ttwlj") pod "66545593-8b73-4d2e-9f66-4eb259ef6752" (UID: "66545593-8b73-4d2e-9f66-4eb259ef6752"). InnerVolumeSpecName "kube-api-access-ttwlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:19:21 crc kubenswrapper[4565]: I1125 09:19:21.188861 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66545593-8b73-4d2e-9f66-4eb259ef6752-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "66545593-8b73-4d2e-9f66-4eb259ef6752" (UID: "66545593-8b73-4d2e-9f66-4eb259ef6752"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:19:21 crc kubenswrapper[4565]: I1125 09:19:21.189069 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66545593-8b73-4d2e-9f66-4eb259ef6752-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "66545593-8b73-4d2e-9f66-4eb259ef6752" (UID: "66545593-8b73-4d2e-9f66-4eb259ef6752"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:19:21 crc kubenswrapper[4565]: I1125 09:19:21.191600 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66545593-8b73-4d2e-9f66-4eb259ef6752-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "66545593-8b73-4d2e-9f66-4eb259ef6752" (UID: "66545593-8b73-4d2e-9f66-4eb259ef6752"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:19:21 crc kubenswrapper[4565]: I1125 09:19:21.192023 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66545593-8b73-4d2e-9f66-4eb259ef6752-config" (OuterVolumeSpecName: "config") pod "66545593-8b73-4d2e-9f66-4eb259ef6752" (UID: "66545593-8b73-4d2e-9f66-4eb259ef6752"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:19:21 crc kubenswrapper[4565]: I1125 09:19:21.247702 4565 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66545593-8b73-4d2e-9f66-4eb259ef6752-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:21 crc kubenswrapper[4565]: I1125 09:19:21.247728 4565 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66545593-8b73-4d2e-9f66-4eb259ef6752-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:21 crc kubenswrapper[4565]: I1125 09:19:21.247740 4565 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66545593-8b73-4d2e-9f66-4eb259ef6752-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:21 crc kubenswrapper[4565]: I1125 09:19:21.247751 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66545593-8b73-4d2e-9f66-4eb259ef6752-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:21 crc kubenswrapper[4565]: I1125 09:19:21.247761 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttwlj\" (UniqueName: \"kubernetes.io/projected/66545593-8b73-4d2e-9f66-4eb259ef6752-kube-api-access-ttwlj\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:21 crc kubenswrapper[4565]: I1125 09:19:21.643763 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-569d458467-6r4gl" event={"ID":"66545593-8b73-4d2e-9f66-4eb259ef6752","Type":"ContainerDied","Data":"f99c2c3c0e0c86a2019eb76bc6c3963f2406c72e8cee221275969f509f8d4086"} Nov 25 09:19:21 crc kubenswrapper[4565]: I1125 09:19:21.643841 4565 scope.go:117] "RemoveContainer" containerID="88a2e475737f9205d7eddd768e245c3341c0cfe05c417ab65bb93947aa664bcb" Nov 25 09:19:21 crc kubenswrapper[4565]: I1125 09:19:21.644027 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-569d458467-6r4gl" Nov 25 09:19:21 crc kubenswrapper[4565]: I1125 09:19:21.679164 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-569d458467-6r4gl"] Nov 25 09:19:21 crc kubenswrapper[4565]: I1125 09:19:21.683359 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-569d458467-6r4gl"] Nov 25 09:19:22 crc kubenswrapper[4565]: E1125 09:19:22.266011 4565 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:37d64e0a00c54e71a4c1fcbbbf7e832f6886ffd03c9a02b6ee3ca48fabc30879" Nov 25 09:19:22 crc kubenswrapper[4565]: E1125 09:19:22.267472 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:37d64e0a00c54e71a4c1fcbbbf7e832f6886ffd03c9a02b6ee3ca48fabc30879,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fs8gv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-9gszq_openstack(33e21a69-41ac-4166-bfeb-6ec0eaff7e64): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 09:19:22 crc kubenswrapper[4565]: E1125 09:19:22.269190 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-9gszq" podUID="33e21a69-41ac-4166-bfeb-6ec0eaff7e64" Nov 25 09:19:22 crc kubenswrapper[4565]: I1125 09:19:22.304672 4565 scope.go:117] "RemoveContainer" containerID="e5876a624ab2421d5cff515ae1a4cc21046d44088906f59cb0968ded0e4bce59" Nov 25 09:19:22 crc kubenswrapper[4565]: I1125 09:19:22.652061 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bs9qc" event={"ID":"fad94c28-1e87-4955-9d65-ca6cfdd4087f","Type":"ContainerStarted","Data":"699588041b9fa9a82019df4339e6f4596a5d5e7c1902dfb383b2b5d2e4aa5a54"} Nov 25 09:19:22 crc kubenswrapper[4565]: I1125 09:19:22.655853 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wz4lt" event={"ID":"754ce7b0-5a2a-4206-a901-94010cec0e08","Type":"ContainerStarted","Data":"6b5671d101ba4871ee228dd23daa78162ab92ec583a6f2f91cc5fc0c43c4c9dc"} Nov 25 09:19:22 crc kubenswrapper[4565]: I1125 09:19:22.662792 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc2a2ff-50de-4bb9-9581-3868db5ec59e","Type":"ContainerStarted","Data":"709bfeb620fb62ca2b2a19ab24e5f640a81754e8c8efa9c9a53b4a007a981e24"} Nov 25 09:19:22 crc kubenswrapper[4565]: E1125 09:19:22.665443 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:37d64e0a00c54e71a4c1fcbbbf7e832f6886ffd03c9a02b6ee3ca48fabc30879\\\"\"" pod="openstack/cinder-db-sync-9gszq" podUID="33e21a69-41ac-4166-bfeb-6ec0eaff7e64" Nov 25 09:19:22 crc kubenswrapper[4565]: I1125 09:19:22.676475 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-bs9qc" podStartSLOduration=1.552837861 podStartE2EDuration="22.676457896s" podCreationTimestamp="2025-11-25 09:19:00 +0000 UTC" firstStartedPulling="2025-11-25 09:19:01.16617645 +0000 UTC m=+874.368671577" lastFinishedPulling="2025-11-25 09:19:22.289796483 +0000 UTC m=+895.492291612" observedRunningTime="2025-11-25 09:19:22.673259587 +0000 UTC m=+895.875754724" watchObservedRunningTime="2025-11-25 09:19:22.676457896 +0000 UTC m=+895.878953033" Nov 25 09:19:22 crc kubenswrapper[4565]: I1125 09:19:22.720905 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-wz4lt" podStartSLOduration=1.727016729 podStartE2EDuration="22.720888986s" podCreationTimestamp="2025-11-25 09:19:00 +0000 UTC" firstStartedPulling="2025-11-25 09:19:01.277939833 +0000 UTC m=+874.480434971" lastFinishedPulling="2025-11-25 09:19:22.27181209 +0000 UTC m=+895.474307228" observedRunningTime="2025-11-25 09:19:22.712503809 +0000 UTC m=+895.914998947" watchObservedRunningTime="2025-11-25 09:19:22.720888986 +0000 UTC m=+895.923384124" Nov 25 09:19:22 crc kubenswrapper[4565]: W1125 09:19:22.796173 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dc7a6cc_0061_40f7_a263_0d3fc86ce86e.slice/crio-2f1afd7d7fc82d9453029d98898fc605aa422cf08a2b0be350ef57913925c557 WatchSource:0}: Error finding container 2f1afd7d7fc82d9453029d98898fc605aa422cf08a2b0be350ef57913925c557: Status 404 returned error can't find the container with id 
2f1afd7d7fc82d9453029d98898fc605aa422cf08a2b0be350ef57913925c557 Nov 25 09:19:22 crc kubenswrapper[4565]: I1125 09:19:22.801308 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4k5dr"] Nov 25 09:19:22 crc kubenswrapper[4565]: I1125 09:19:22.862518 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6677d66f85-8rlv9"] Nov 25 09:19:22 crc kubenswrapper[4565]: W1125 09:19:22.880290 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1b008af_c21f_44a0_a919_9a79b7b8e26e.slice/crio-2becb3583cf2541863bb9e88129af2fe627730e0a2ab1a070034b4e19ba4c58d WatchSource:0}: Error finding container 2becb3583cf2541863bb9e88129af2fe627730e0a2ab1a070034b4e19ba4c58d: Status 404 returned error can't find the container with id 2becb3583cf2541863bb9e88129af2fe627730e0a2ab1a070034b4e19ba4c58d Nov 25 09:19:22 crc kubenswrapper[4565]: I1125 09:19:22.956492 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d4d697775-b8wbb"] Nov 25 09:19:23 crc kubenswrapper[4565]: I1125 09:19:23.044799 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-569d458467-6r4gl" podUID="66545593-8b73-4d2e-9f66-4eb259ef6752" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: i/o timeout" Nov 25 09:19:23 crc kubenswrapper[4565]: I1125 09:19:23.109276 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66545593-8b73-4d2e-9f66-4eb259ef6752" path="/var/lib/kubelet/pods/66545593-8b73-4d2e-9f66-4eb259ef6752/volumes" Nov 25 09:19:23 crc kubenswrapper[4565]: I1125 09:19:23.725260 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4k5dr" event={"ID":"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e","Type":"ContainerStarted","Data":"536b7ef9f5243b894bb6a0f1e32b92826ef47e276d76df599efa15830e42ccde"} Nov 25 09:19:23 crc 
kubenswrapper[4565]: I1125 09:19:23.725565 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4k5dr" event={"ID":"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e","Type":"ContainerStarted","Data":"2f1afd7d7fc82d9453029d98898fc605aa422cf08a2b0be350ef57913925c557"} Nov 25 09:19:23 crc kubenswrapper[4565]: I1125 09:19:23.729962 4565 generic.go:334] "Generic (PLEG): container finished" podID="f1b008af-c21f-44a0-a919-9a79b7b8e26e" containerID="96e9a1cae862c2ce2663f7c6ba1915ba6b6985c0fc55645f115c1cadefafd76b" exitCode=0 Nov 25 09:19:23 crc kubenswrapper[4565]: I1125 09:19:23.730030 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" event={"ID":"f1b008af-c21f-44a0-a919-9a79b7b8e26e","Type":"ContainerDied","Data":"96e9a1cae862c2ce2663f7c6ba1915ba6b6985c0fc55645f115c1cadefafd76b"} Nov 25 09:19:23 crc kubenswrapper[4565]: I1125 09:19:23.730060 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" event={"ID":"f1b008af-c21f-44a0-a919-9a79b7b8e26e","Type":"ContainerStarted","Data":"2becb3583cf2541863bb9e88129af2fe627730e0a2ab1a070034b4e19ba4c58d"} Nov 25 09:19:23 crc kubenswrapper[4565]: I1125 09:19:23.746800 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-4k5dr" podStartSLOduration=16.746778692 podStartE2EDuration="16.746778692s" podCreationTimestamp="2025-11-25 09:19:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:19:23.740234565 +0000 UTC m=+896.942729703" watchObservedRunningTime="2025-11-25 09:19:23.746778692 +0000 UTC m=+896.949273830" Nov 25 09:19:23 crc kubenswrapper[4565]: I1125 09:19:23.748295 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d4d697775-b8wbb" 
event={"ID":"aca1b06e-7a63-4a42-873b-427de57cef7f","Type":"ContainerStarted","Data":"00753de4b8f4d38fdb294d4cc4db3ddfeef90e96ee7c95a656b0793b76226a80"} Nov 25 09:19:23 crc kubenswrapper[4565]: I1125 09:19:23.748330 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d4d697775-b8wbb" event={"ID":"aca1b06e-7a63-4a42-873b-427de57cef7f","Type":"ContainerStarted","Data":"e66cb926bdbb83fa9941388b6becc0bd2691bc184650008bc73b0ed6d20adca8"} Nov 25 09:19:23 crc kubenswrapper[4565]: I1125 09:19:23.748345 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:23 crc kubenswrapper[4565]: I1125 09:19:23.748354 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d4d697775-b8wbb" event={"ID":"aca1b06e-7a63-4a42-873b-427de57cef7f","Type":"ContainerStarted","Data":"f47a8f85c5f9cafcb47738e8a8cf7d5f6bd69a0c6164d3c78491aef5e126ab11"} Nov 25 09:19:23 crc kubenswrapper[4565]: I1125 09:19:23.842478 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6d4d697775-b8wbb" podStartSLOduration=7.842447354 podStartE2EDuration="7.842447354s" podCreationTimestamp="2025-11-25 09:19:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:19:23.824374423 +0000 UTC m=+897.026869561" watchObservedRunningTime="2025-11-25 09:19:23.842447354 +0000 UTC m=+897.044942492" Nov 25 09:19:23 crc kubenswrapper[4565]: I1125 09:19:23.969540 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b76df5f9b-dzngk"] Nov 25 09:19:24 crc kubenswrapper[4565]: I1125 09:19:24.755401 4565 generic.go:334] "Generic (PLEG): container finished" podID="754ce7b0-5a2a-4206-a901-94010cec0e08" containerID="6b5671d101ba4871ee228dd23daa78162ab92ec583a6f2f91cc5fc0c43c4c9dc" exitCode=0 Nov 25 09:19:24 crc kubenswrapper[4565]: I1125 09:19:24.755487 4565 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wz4lt" event={"ID":"754ce7b0-5a2a-4206-a901-94010cec0e08","Type":"ContainerDied","Data":"6b5671d101ba4871ee228dd23daa78162ab92ec583a6f2f91cc5fc0c43c4c9dc"} Nov 25 09:19:24 crc kubenswrapper[4565]: I1125 09:19:24.758557 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b76df5f9b-dzngk" event={"ID":"4a7875ff-89fe-4226-904b-622edafc2aac","Type":"ContainerStarted","Data":"973f7a2aeb8540746e3853b1737e057059600417fb0500a77215b381aa150c99"} Nov 25 09:19:24 crc kubenswrapper[4565]: I1125 09:19:24.758583 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b76df5f9b-dzngk" event={"ID":"4a7875ff-89fe-4226-904b-622edafc2aac","Type":"ContainerStarted","Data":"7e2778b9507db2a090098f46b3d8535fb3445c059d9aa00e260b6fe9bf076525"} Nov 25 09:19:24 crc kubenswrapper[4565]: I1125 09:19:24.758593 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b76df5f9b-dzngk" event={"ID":"4a7875ff-89fe-4226-904b-622edafc2aac","Type":"ContainerStarted","Data":"dd4c715d8252ab5be1315e1e80cd94cd8c9bcf306b09abb20a281e8ad214e2a0"} Nov 25 09:19:24 crc kubenswrapper[4565]: I1125 09:19:24.759462 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-b76df5f9b-dzngk" Nov 25 09:19:24 crc kubenswrapper[4565]: I1125 09:19:24.771780 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" event={"ID":"f1b008af-c21f-44a0-a919-9a79b7b8e26e","Type":"ContainerStarted","Data":"6d7c1ef1705baee04d4a3c348e526f81df4dcff84af340ec9307ad25780b0664"} Nov 25 09:19:24 crc kubenswrapper[4565]: I1125 09:19:24.771806 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" Nov 25 09:19:24 crc kubenswrapper[4565]: I1125 09:19:24.794593 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-b76df5f9b-dzngk" podStartSLOduration=10.794580566 podStartE2EDuration="10.794580566s" podCreationTimestamp="2025-11-25 09:19:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:19:24.793381386 +0000 UTC m=+897.995876513" watchObservedRunningTime="2025-11-25 09:19:24.794580566 +0000 UTC m=+897.997075724" Nov 25 09:19:24 crc kubenswrapper[4565]: I1125 09:19:24.807455 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" podStartSLOduration=10.807431532 podStartE2EDuration="10.807431532s" podCreationTimestamp="2025-11-25 09:19:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:19:24.805470246 +0000 UTC m=+898.007965383" watchObservedRunningTime="2025-11-25 09:19:24.807431532 +0000 UTC m=+898.009926660" Nov 25 09:19:25 crc kubenswrapper[4565]: I1125 09:19:25.100057 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:19:25 crc kubenswrapper[4565]: I1125 09:19:25.100102 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:19:25 crc kubenswrapper[4565]: I1125 09:19:25.786654 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"bcc2a2ff-50de-4bb9-9581-3868db5ec59e","Type":"ContainerStarted","Data":"c5d79a00cf67f858e39faa1faa6bf83a39983cf0cd38c94623de28267d33c514"} Nov 25 09:19:25 crc kubenswrapper[4565]: I1125 09:19:25.789660 4565 generic.go:334] "Generic (PLEG): container finished" podID="fad94c28-1e87-4955-9d65-ca6cfdd4087f" containerID="699588041b9fa9a82019df4339e6f4596a5d5e7c1902dfb383b2b5d2e4aa5a54" exitCode=0 Nov 25 09:19:25 crc kubenswrapper[4565]: I1125 09:19:25.789764 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bs9qc" event={"ID":"fad94c28-1e87-4955-9d65-ca6cfdd4087f","Type":"ContainerDied","Data":"699588041b9fa9a82019df4339e6f4596a5d5e7c1902dfb383b2b5d2e4aa5a54"} Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.155211 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-wz4lt" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.366362 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/754ce7b0-5a2a-4206-a901-94010cec0e08-scripts\") pod \"754ce7b0-5a2a-4206-a901-94010cec0e08\" (UID: \"754ce7b0-5a2a-4206-a901-94010cec0e08\") " Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.366996 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/754ce7b0-5a2a-4206-a901-94010cec0e08-logs\") pod \"754ce7b0-5a2a-4206-a901-94010cec0e08\" (UID: \"754ce7b0-5a2a-4206-a901-94010cec0e08\") " Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.367144 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8mh2\" (UniqueName: \"kubernetes.io/projected/754ce7b0-5a2a-4206-a901-94010cec0e08-kube-api-access-t8mh2\") pod \"754ce7b0-5a2a-4206-a901-94010cec0e08\" (UID: \"754ce7b0-5a2a-4206-a901-94010cec0e08\") " Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 
09:19:26.367447 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/754ce7b0-5a2a-4206-a901-94010cec0e08-logs" (OuterVolumeSpecName: "logs") pod "754ce7b0-5a2a-4206-a901-94010cec0e08" (UID: "754ce7b0-5a2a-4206-a901-94010cec0e08"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.367753 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/754ce7b0-5a2a-4206-a901-94010cec0e08-combined-ca-bundle\") pod \"754ce7b0-5a2a-4206-a901-94010cec0e08\" (UID: \"754ce7b0-5a2a-4206-a901-94010cec0e08\") " Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.367790 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/754ce7b0-5a2a-4206-a901-94010cec0e08-config-data\") pod \"754ce7b0-5a2a-4206-a901-94010cec0e08\" (UID: \"754ce7b0-5a2a-4206-a901-94010cec0e08\") " Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.368751 4565 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/754ce7b0-5a2a-4206-a901-94010cec0e08-logs\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.375523 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/754ce7b0-5a2a-4206-a901-94010cec0e08-kube-api-access-t8mh2" (OuterVolumeSpecName: "kube-api-access-t8mh2") pod "754ce7b0-5a2a-4206-a901-94010cec0e08" (UID: "754ce7b0-5a2a-4206-a901-94010cec0e08"). InnerVolumeSpecName "kube-api-access-t8mh2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.387173 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/754ce7b0-5a2a-4206-a901-94010cec0e08-scripts" (OuterVolumeSpecName: "scripts") pod "754ce7b0-5a2a-4206-a901-94010cec0e08" (UID: "754ce7b0-5a2a-4206-a901-94010cec0e08"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.404043 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/754ce7b0-5a2a-4206-a901-94010cec0e08-config-data" (OuterVolumeSpecName: "config-data") pod "754ce7b0-5a2a-4206-a901-94010cec0e08" (UID: "754ce7b0-5a2a-4206-a901-94010cec0e08"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.411971 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/754ce7b0-5a2a-4206-a901-94010cec0e08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "754ce7b0-5a2a-4206-a901-94010cec0e08" (UID: "754ce7b0-5a2a-4206-a901-94010cec0e08"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.472299 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8mh2\" (UniqueName: \"kubernetes.io/projected/754ce7b0-5a2a-4206-a901-94010cec0e08-kube-api-access-t8mh2\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.472440 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/754ce7b0-5a2a-4206-a901-94010cec0e08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.472509 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/754ce7b0-5a2a-4206-a901-94010cec0e08-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.472561 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/754ce7b0-5a2a-4206-a901-94010cec0e08-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.827842 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wz4lt" event={"ID":"754ce7b0-5a2a-4206-a901-94010cec0e08","Type":"ContainerDied","Data":"a9f93957e635e46ecc8a1815b0eeafcd01223e5524c1672ec218f33c2ec45927"} Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.827914 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9f93957e635e46ecc8a1815b0eeafcd01223e5524c1672ec218f33c2ec45927" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.828107 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-wz4lt" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.829418 4565 generic.go:334] "Generic (PLEG): container finished" podID="8dc7a6cc-0061-40f7-a263-0d3fc86ce86e" containerID="536b7ef9f5243b894bb6a0f1e32b92826ef47e276d76df599efa15830e42ccde" exitCode=0 Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.829923 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4k5dr" event={"ID":"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e","Type":"ContainerDied","Data":"536b7ef9f5243b894bb6a0f1e32b92826ef47e276d76df599efa15830e42ccde"} Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.882016 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-66b84f9bd8-pdssf"] Nov 25 09:19:26 crc kubenswrapper[4565]: E1125 09:19:26.882518 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66545593-8b73-4d2e-9f66-4eb259ef6752" containerName="dnsmasq-dns" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.882543 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="66545593-8b73-4d2e-9f66-4eb259ef6752" containerName="dnsmasq-dns" Nov 25 09:19:26 crc kubenswrapper[4565]: E1125 09:19:26.882565 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="754ce7b0-5a2a-4206-a901-94010cec0e08" containerName="placement-db-sync" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.882571 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="754ce7b0-5a2a-4206-a901-94010cec0e08" containerName="placement-db-sync" Nov 25 09:19:26 crc kubenswrapper[4565]: E1125 09:19:26.882588 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66545593-8b73-4d2e-9f66-4eb259ef6752" containerName="init" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.882595 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="66545593-8b73-4d2e-9f66-4eb259ef6752" containerName="init" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 
09:19:26.882840 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="66545593-8b73-4d2e-9f66-4eb259ef6752" containerName="dnsmasq-dns" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.882874 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="754ce7b0-5a2a-4206-a901-94010cec0e08" containerName="placement-db-sync" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.884087 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.887676 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.888128 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ntkbc" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.888452 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.888821 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.888967 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.923024 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-66b84f9bd8-pdssf"] Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.989222 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d87b83a-02e7-48d5-8f87-e127fe8ffe0b-public-tls-certs\") pod \"placement-66b84f9bd8-pdssf\" (UID: \"7d87b83a-02e7-48d5-8f87-e127fe8ffe0b\") " pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:26 crc kubenswrapper[4565]: 
I1125 09:19:26.989316 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz4jc\" (UniqueName: \"kubernetes.io/projected/7d87b83a-02e7-48d5-8f87-e127fe8ffe0b-kube-api-access-hz4jc\") pod \"placement-66b84f9bd8-pdssf\" (UID: \"7d87b83a-02e7-48d5-8f87-e127fe8ffe0b\") " pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.989503 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d87b83a-02e7-48d5-8f87-e127fe8ffe0b-logs\") pod \"placement-66b84f9bd8-pdssf\" (UID: \"7d87b83a-02e7-48d5-8f87-e127fe8ffe0b\") " pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.989535 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d87b83a-02e7-48d5-8f87-e127fe8ffe0b-combined-ca-bundle\") pod \"placement-66b84f9bd8-pdssf\" (UID: \"7d87b83a-02e7-48d5-8f87-e127fe8ffe0b\") " pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.989746 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d87b83a-02e7-48d5-8f87-e127fe8ffe0b-config-data\") pod \"placement-66b84f9bd8-pdssf\" (UID: \"7d87b83a-02e7-48d5-8f87-e127fe8ffe0b\") " pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:26 crc kubenswrapper[4565]: I1125 09:19:26.989920 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d87b83a-02e7-48d5-8f87-e127fe8ffe0b-internal-tls-certs\") pod \"placement-66b84f9bd8-pdssf\" (UID: \"7d87b83a-02e7-48d5-8f87-e127fe8ffe0b\") " pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:26 crc 
kubenswrapper[4565]: I1125 09:19:26.989981 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d87b83a-02e7-48d5-8f87-e127fe8ffe0b-scripts\") pod \"placement-66b84f9bd8-pdssf\" (UID: \"7d87b83a-02e7-48d5-8f87-e127fe8ffe0b\") " pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:27 crc kubenswrapper[4565]: I1125 09:19:27.091209 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d87b83a-02e7-48d5-8f87-e127fe8ffe0b-internal-tls-certs\") pod \"placement-66b84f9bd8-pdssf\" (UID: \"7d87b83a-02e7-48d5-8f87-e127fe8ffe0b\") " pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:27 crc kubenswrapper[4565]: I1125 09:19:27.091258 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d87b83a-02e7-48d5-8f87-e127fe8ffe0b-scripts\") pod \"placement-66b84f9bd8-pdssf\" (UID: \"7d87b83a-02e7-48d5-8f87-e127fe8ffe0b\") " pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:27 crc kubenswrapper[4565]: I1125 09:19:27.091355 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d87b83a-02e7-48d5-8f87-e127fe8ffe0b-public-tls-certs\") pod \"placement-66b84f9bd8-pdssf\" (UID: \"7d87b83a-02e7-48d5-8f87-e127fe8ffe0b\") " pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:27 crc kubenswrapper[4565]: I1125 09:19:27.091377 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz4jc\" (UniqueName: \"kubernetes.io/projected/7d87b83a-02e7-48d5-8f87-e127fe8ffe0b-kube-api-access-hz4jc\") pod \"placement-66b84f9bd8-pdssf\" (UID: \"7d87b83a-02e7-48d5-8f87-e127fe8ffe0b\") " pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:27 crc kubenswrapper[4565]: I1125 09:19:27.091419 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d87b83a-02e7-48d5-8f87-e127fe8ffe0b-logs\") pod \"placement-66b84f9bd8-pdssf\" (UID: \"7d87b83a-02e7-48d5-8f87-e127fe8ffe0b\") " pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:27 crc kubenswrapper[4565]: I1125 09:19:27.091439 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d87b83a-02e7-48d5-8f87-e127fe8ffe0b-combined-ca-bundle\") pod \"placement-66b84f9bd8-pdssf\" (UID: \"7d87b83a-02e7-48d5-8f87-e127fe8ffe0b\") " pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:27 crc kubenswrapper[4565]: I1125 09:19:27.091467 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d87b83a-02e7-48d5-8f87-e127fe8ffe0b-config-data\") pod \"placement-66b84f9bd8-pdssf\" (UID: \"7d87b83a-02e7-48d5-8f87-e127fe8ffe0b\") " pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:27 crc kubenswrapper[4565]: I1125 09:19:27.092376 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d87b83a-02e7-48d5-8f87-e127fe8ffe0b-logs\") pod \"placement-66b84f9bd8-pdssf\" (UID: \"7d87b83a-02e7-48d5-8f87-e127fe8ffe0b\") " pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:27 crc kubenswrapper[4565]: I1125 09:19:27.099461 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 25 09:19:27 crc kubenswrapper[4565]: I1125 09:19:27.099587 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Nov 25 09:19:27 crc kubenswrapper[4565]: I1125 09:19:27.100039 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 25 09:19:27 crc kubenswrapper[4565]: I1125 09:19:27.100045 4565 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Nov 25 09:19:27 crc kubenswrapper[4565]: I1125 09:19:27.126217 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d87b83a-02e7-48d5-8f87-e127fe8ffe0b-internal-tls-certs\") pod \"placement-66b84f9bd8-pdssf\" (UID: \"7d87b83a-02e7-48d5-8f87-e127fe8ffe0b\") " pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:27 crc kubenswrapper[4565]: I1125 09:19:27.133156 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d87b83a-02e7-48d5-8f87-e127fe8ffe0b-public-tls-certs\") pod \"placement-66b84f9bd8-pdssf\" (UID: \"7d87b83a-02e7-48d5-8f87-e127fe8ffe0b\") " pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:27 crc kubenswrapper[4565]: I1125 09:19:27.134524 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz4jc\" (UniqueName: \"kubernetes.io/projected/7d87b83a-02e7-48d5-8f87-e127fe8ffe0b-kube-api-access-hz4jc\") pod \"placement-66b84f9bd8-pdssf\" (UID: \"7d87b83a-02e7-48d5-8f87-e127fe8ffe0b\") " pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:27 crc kubenswrapper[4565]: I1125 09:19:27.138473 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d87b83a-02e7-48d5-8f87-e127fe8ffe0b-scripts\") pod \"placement-66b84f9bd8-pdssf\" (UID: \"7d87b83a-02e7-48d5-8f87-e127fe8ffe0b\") " pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:27 crc kubenswrapper[4565]: I1125 09:19:27.140003 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d87b83a-02e7-48d5-8f87-e127fe8ffe0b-config-data\") pod \"placement-66b84f9bd8-pdssf\" (UID: \"7d87b83a-02e7-48d5-8f87-e127fe8ffe0b\") " pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:27 
crc kubenswrapper[4565]: I1125 09:19:27.140746 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d87b83a-02e7-48d5-8f87-e127fe8ffe0b-combined-ca-bundle\") pod \"placement-66b84f9bd8-pdssf\" (UID: \"7d87b83a-02e7-48d5-8f87-e127fe8ffe0b\") " pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:27 crc kubenswrapper[4565]: E1125 09:19:27.177650 4565 info.go:109] Failed to get network devices: open /sys/class/net/2574258413dbde7/address: no such file or directory Nov 25 09:19:27 crc kubenswrapper[4565]: I1125 09:19:27.208595 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ntkbc" Nov 25 09:19:27 crc kubenswrapper[4565]: I1125 09:19:27.243662 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:28 crc kubenswrapper[4565]: I1125 09:19:28.807568 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4k5dr" Nov 25 09:19:28 crc kubenswrapper[4565]: I1125 09:19:28.857303 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4k5dr" event={"ID":"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e","Type":"ContainerDied","Data":"2f1afd7d7fc82d9453029d98898fc605aa422cf08a2b0be350ef57913925c557"} Nov 25 09:19:28 crc kubenswrapper[4565]: I1125 09:19:28.857355 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f1afd7d7fc82d9453029d98898fc605aa422cf08a2b0be350ef57913925c557" Nov 25 09:19:28 crc kubenswrapper[4565]: I1125 09:19:28.857444 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4k5dr" Nov 25 09:19:28 crc kubenswrapper[4565]: I1125 09:19:28.937121 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hht65\" (UniqueName: \"kubernetes.io/projected/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-kube-api-access-hht65\") pod \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\" (UID: \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\") " Nov 25 09:19:28 crc kubenswrapper[4565]: I1125 09:19:28.937192 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-scripts\") pod \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\" (UID: \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\") " Nov 25 09:19:28 crc kubenswrapper[4565]: I1125 09:19:28.937234 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-credential-keys\") pod \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\" (UID: \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\") " Nov 25 09:19:28 crc kubenswrapper[4565]: I1125 09:19:28.937360 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-combined-ca-bundle\") pod \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\" (UID: \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\") " Nov 25 09:19:28 crc kubenswrapper[4565]: I1125 09:19:28.937403 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-fernet-keys\") pod \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\" (UID: \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\") " Nov 25 09:19:28 crc kubenswrapper[4565]: I1125 09:19:28.937509 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-config-data\") pod \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\" (UID: \"8dc7a6cc-0061-40f7-a263-0d3fc86ce86e\") " Nov 25 09:19:28 crc kubenswrapper[4565]: I1125 09:19:28.943703 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-scripts" (OuterVolumeSpecName: "scripts") pod "8dc7a6cc-0061-40f7-a263-0d3fc86ce86e" (UID: "8dc7a6cc-0061-40f7-a263-0d3fc86ce86e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:28 crc kubenswrapper[4565]: I1125 09:19:28.943791 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8dc7a6cc-0061-40f7-a263-0d3fc86ce86e" (UID: "8dc7a6cc-0061-40f7-a263-0d3fc86ce86e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:28 crc kubenswrapper[4565]: I1125 09:19:28.946213 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-kube-api-access-hht65" (OuterVolumeSpecName: "kube-api-access-hht65") pod "8dc7a6cc-0061-40f7-a263-0d3fc86ce86e" (UID: "8dc7a6cc-0061-40f7-a263-0d3fc86ce86e"). InnerVolumeSpecName "kube-api-access-hht65". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:19:28 crc kubenswrapper[4565]: I1125 09:19:28.950155 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8dc7a6cc-0061-40f7-a263-0d3fc86ce86e" (UID: "8dc7a6cc-0061-40f7-a263-0d3fc86ce86e"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:28 crc kubenswrapper[4565]: I1125 09:19:28.962478 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-config-data" (OuterVolumeSpecName: "config-data") pod "8dc7a6cc-0061-40f7-a263-0d3fc86ce86e" (UID: "8dc7a6cc-0061-40f7-a263-0d3fc86ce86e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:28 crc kubenswrapper[4565]: I1125 09:19:28.962610 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8dc7a6cc-0061-40f7-a263-0d3fc86ce86e" (UID: "8dc7a6cc-0061-40f7-a263-0d3fc86ce86e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.035579 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-757f65c548-72sgl"] Nov 25 09:19:29 crc kubenswrapper[4565]: E1125 09:19:29.035996 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dc7a6cc-0061-40f7-a263-0d3fc86ce86e" containerName="keystone-bootstrap" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.036015 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dc7a6cc-0061-40f7-a263-0d3fc86ce86e" containerName="keystone-bootstrap" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.036184 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dc7a6cc-0061-40f7-a263-0d3fc86ce86e" containerName="keystone-bootstrap" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.036757 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.039794 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/187eb1ab-d8ed-4506-bc95-59cb1f61e285-fernet-keys\") pod \"keystone-757f65c548-72sgl\" (UID: \"187eb1ab-d8ed-4506-bc95-59cb1f61e285\") " pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.039844 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/187eb1ab-d8ed-4506-bc95-59cb1f61e285-credential-keys\") pod \"keystone-757f65c548-72sgl\" (UID: \"187eb1ab-d8ed-4506-bc95-59cb1f61e285\") " pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.039883 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/187eb1ab-d8ed-4506-bc95-59cb1f61e285-config-data\") pod \"keystone-757f65c548-72sgl\" (UID: \"187eb1ab-d8ed-4506-bc95-59cb1f61e285\") " pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.039890 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.039941 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187eb1ab-d8ed-4506-bc95-59cb1f61e285-combined-ca-bundle\") pod \"keystone-757f65c548-72sgl\" (UID: \"187eb1ab-d8ed-4506-bc95-59cb1f61e285\") " pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.039974 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/187eb1ab-d8ed-4506-bc95-59cb1f61e285-scripts\") pod \"keystone-757f65c548-72sgl\" (UID: \"187eb1ab-d8ed-4506-bc95-59cb1f61e285\") " pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.040030 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzbxz\" (UniqueName: \"kubernetes.io/projected/187eb1ab-d8ed-4506-bc95-59cb1f61e285-kube-api-access-mzbxz\") pod \"keystone-757f65c548-72sgl\" (UID: \"187eb1ab-d8ed-4506-bc95-59cb1f61e285\") " pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.040097 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.040169 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/187eb1ab-d8ed-4506-bc95-59cb1f61e285-public-tls-certs\") pod \"keystone-757f65c548-72sgl\" (UID: \"187eb1ab-d8ed-4506-bc95-59cb1f61e285\") " pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.040195 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/187eb1ab-d8ed-4506-bc95-59cb1f61e285-internal-tls-certs\") pod \"keystone-757f65c548-72sgl\" (UID: \"187eb1ab-d8ed-4506-bc95-59cb1f61e285\") " pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.040435 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.040450 4565 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-hht65\" (UniqueName: \"kubernetes.io/projected/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-kube-api-access-hht65\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.040459 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.040467 4565 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.040477 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.040485 4565 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.062679 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-757f65c548-72sgl"] Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.142286 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/187eb1ab-d8ed-4506-bc95-59cb1f61e285-public-tls-certs\") pod \"keystone-757f65c548-72sgl\" (UID: \"187eb1ab-d8ed-4506-bc95-59cb1f61e285\") " pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.142330 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/187eb1ab-d8ed-4506-bc95-59cb1f61e285-internal-tls-certs\") pod \"keystone-757f65c548-72sgl\" (UID: \"187eb1ab-d8ed-4506-bc95-59cb1f61e285\") " pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.142391 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/187eb1ab-d8ed-4506-bc95-59cb1f61e285-fernet-keys\") pod \"keystone-757f65c548-72sgl\" (UID: \"187eb1ab-d8ed-4506-bc95-59cb1f61e285\") " pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.142417 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/187eb1ab-d8ed-4506-bc95-59cb1f61e285-credential-keys\") pod \"keystone-757f65c548-72sgl\" (UID: \"187eb1ab-d8ed-4506-bc95-59cb1f61e285\") " pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.142454 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/187eb1ab-d8ed-4506-bc95-59cb1f61e285-config-data\") pod \"keystone-757f65c548-72sgl\" (UID: \"187eb1ab-d8ed-4506-bc95-59cb1f61e285\") " pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.142501 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187eb1ab-d8ed-4506-bc95-59cb1f61e285-combined-ca-bundle\") pod \"keystone-757f65c548-72sgl\" (UID: \"187eb1ab-d8ed-4506-bc95-59cb1f61e285\") " pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.142520 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/187eb1ab-d8ed-4506-bc95-59cb1f61e285-scripts\") pod \"keystone-757f65c548-72sgl\" 
(UID: \"187eb1ab-d8ed-4506-bc95-59cb1f61e285\") " pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.142573 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzbxz\" (UniqueName: \"kubernetes.io/projected/187eb1ab-d8ed-4506-bc95-59cb1f61e285-kube-api-access-mzbxz\") pod \"keystone-757f65c548-72sgl\" (UID: \"187eb1ab-d8ed-4506-bc95-59cb1f61e285\") " pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.147130 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187eb1ab-d8ed-4506-bc95-59cb1f61e285-combined-ca-bundle\") pod \"keystone-757f65c548-72sgl\" (UID: \"187eb1ab-d8ed-4506-bc95-59cb1f61e285\") " pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.148320 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/187eb1ab-d8ed-4506-bc95-59cb1f61e285-scripts\") pod \"keystone-757f65c548-72sgl\" (UID: \"187eb1ab-d8ed-4506-bc95-59cb1f61e285\") " pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.148777 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/187eb1ab-d8ed-4506-bc95-59cb1f61e285-public-tls-certs\") pod \"keystone-757f65c548-72sgl\" (UID: \"187eb1ab-d8ed-4506-bc95-59cb1f61e285\") " pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.149157 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/187eb1ab-d8ed-4506-bc95-59cb1f61e285-internal-tls-certs\") pod \"keystone-757f65c548-72sgl\" (UID: \"187eb1ab-d8ed-4506-bc95-59cb1f61e285\") " pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 
crc kubenswrapper[4565]: I1125 09:19:29.151418 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/187eb1ab-d8ed-4506-bc95-59cb1f61e285-fernet-keys\") pod \"keystone-757f65c548-72sgl\" (UID: \"187eb1ab-d8ed-4506-bc95-59cb1f61e285\") " pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.158033 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzbxz\" (UniqueName: \"kubernetes.io/projected/187eb1ab-d8ed-4506-bc95-59cb1f61e285-kube-api-access-mzbxz\") pod \"keystone-757f65c548-72sgl\" (UID: \"187eb1ab-d8ed-4506-bc95-59cb1f61e285\") " pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.162397 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/187eb1ab-d8ed-4506-bc95-59cb1f61e285-credential-keys\") pod \"keystone-757f65c548-72sgl\" (UID: \"187eb1ab-d8ed-4506-bc95-59cb1f61e285\") " pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.165611 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/187eb1ab-d8ed-4506-bc95-59cb1f61e285-config-data\") pod \"keystone-757f65c548-72sgl\" (UID: \"187eb1ab-d8ed-4506-bc95-59cb1f61e285\") " pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.356107 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.787092 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.905663 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66f4bdbdb7-p7zfn"] Nov 25 09:19:29 crc kubenswrapper[4565]: I1125 09:19:29.920211 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" podUID="82113c4e-4aec-4aed-8edf-8918d49c64de" containerName="dnsmasq-dns" containerID="cri-o://c6291dff42c38991e0c84554070fc41a1701223885d0eb0a69034b4b89e7f0fe" gracePeriod=10 Nov 25 09:19:30 crc kubenswrapper[4565]: I1125 09:19:30.505743 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" podUID="82113c4e-4aec-4aed-8edf-8918d49c64de" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.136:5353: connect: connection refused" Nov 25 09:19:30 crc kubenswrapper[4565]: I1125 09:19:30.903801 4565 generic.go:334] "Generic (PLEG): container finished" podID="82113c4e-4aec-4aed-8edf-8918d49c64de" containerID="c6291dff42c38991e0c84554070fc41a1701223885d0eb0a69034b4b89e7f0fe" exitCode=0 Nov 25 09:19:30 crc kubenswrapper[4565]: I1125 09:19:30.903852 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" event={"ID":"82113c4e-4aec-4aed-8edf-8918d49c64de","Type":"ContainerDied","Data":"c6291dff42c38991e0c84554070fc41a1701223885d0eb0a69034b4b89e7f0fe"} Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.127111 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-bs9qc" Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.281242 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.310057 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fad94c28-1e87-4955-9d65-ca6cfdd4087f-combined-ca-bundle\") pod \"fad94c28-1e87-4955-9d65-ca6cfdd4087f\" (UID: \"fad94c28-1e87-4955-9d65-ca6cfdd4087f\") " Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.310097 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fad94c28-1e87-4955-9d65-ca6cfdd4087f-db-sync-config-data\") pod \"fad94c28-1e87-4955-9d65-ca6cfdd4087f\" (UID: \"fad94c28-1e87-4955-9d65-ca6cfdd4087f\") " Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.310155 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sqqn\" (UniqueName: \"kubernetes.io/projected/fad94c28-1e87-4955-9d65-ca6cfdd4087f-kube-api-access-2sqqn\") pod \"fad94c28-1e87-4955-9d65-ca6cfdd4087f\" (UID: \"fad94c28-1e87-4955-9d65-ca6cfdd4087f\") " Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.321067 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fad94c28-1e87-4955-9d65-ca6cfdd4087f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fad94c28-1e87-4955-9d65-ca6cfdd4087f" (UID: "fad94c28-1e87-4955-9d65-ca6cfdd4087f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.321315 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fad94c28-1e87-4955-9d65-ca6cfdd4087f-kube-api-access-2sqqn" (OuterVolumeSpecName: "kube-api-access-2sqqn") pod "fad94c28-1e87-4955-9d65-ca6cfdd4087f" (UID: "fad94c28-1e87-4955-9d65-ca6cfdd4087f"). 
InnerVolumeSpecName "kube-api-access-2sqqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.343998 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fad94c28-1e87-4955-9d65-ca6cfdd4087f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fad94c28-1e87-4955-9d65-ca6cfdd4087f" (UID: "fad94c28-1e87-4955-9d65-ca6cfdd4087f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.413089 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82113c4e-4aec-4aed-8edf-8918d49c64de-config\") pod \"82113c4e-4aec-4aed-8edf-8918d49c64de\" (UID: \"82113c4e-4aec-4aed-8edf-8918d49c64de\") " Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.413294 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82113c4e-4aec-4aed-8edf-8918d49c64de-ovsdbserver-sb\") pod \"82113c4e-4aec-4aed-8edf-8918d49c64de\" (UID: \"82113c4e-4aec-4aed-8edf-8918d49c64de\") " Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.413412 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82113c4e-4aec-4aed-8edf-8918d49c64de-ovsdbserver-nb\") pod \"82113c4e-4aec-4aed-8edf-8918d49c64de\" (UID: \"82113c4e-4aec-4aed-8edf-8918d49c64de\") " Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.413475 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82113c4e-4aec-4aed-8edf-8918d49c64de-dns-svc\") pod \"82113c4e-4aec-4aed-8edf-8918d49c64de\" (UID: \"82113c4e-4aec-4aed-8edf-8918d49c64de\") " Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.413511 
4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6gjz\" (UniqueName: \"kubernetes.io/projected/82113c4e-4aec-4aed-8edf-8918d49c64de-kube-api-access-j6gjz\") pod \"82113c4e-4aec-4aed-8edf-8918d49c64de\" (UID: \"82113c4e-4aec-4aed-8edf-8918d49c64de\") " Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.413991 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fad94c28-1e87-4955-9d65-ca6cfdd4087f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.414009 4565 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fad94c28-1e87-4955-9d65-ca6cfdd4087f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.414018 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sqqn\" (UniqueName: \"kubernetes.io/projected/fad94c28-1e87-4955-9d65-ca6cfdd4087f-kube-api-access-2sqqn\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.418879 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82113c4e-4aec-4aed-8edf-8918d49c64de-kube-api-access-j6gjz" (OuterVolumeSpecName: "kube-api-access-j6gjz") pod "82113c4e-4aec-4aed-8edf-8918d49c64de" (UID: "82113c4e-4aec-4aed-8edf-8918d49c64de"). InnerVolumeSpecName "kube-api-access-j6gjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.463028 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82113c4e-4aec-4aed-8edf-8918d49c64de-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "82113c4e-4aec-4aed-8edf-8918d49c64de" (UID: "82113c4e-4aec-4aed-8edf-8918d49c64de"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.463958 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82113c4e-4aec-4aed-8edf-8918d49c64de-config" (OuterVolumeSpecName: "config") pod "82113c4e-4aec-4aed-8edf-8918d49c64de" (UID: "82113c4e-4aec-4aed-8edf-8918d49c64de"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.466987 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82113c4e-4aec-4aed-8edf-8918d49c64de-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "82113c4e-4aec-4aed-8edf-8918d49c64de" (UID: "82113c4e-4aec-4aed-8edf-8918d49c64de"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.472441 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82113c4e-4aec-4aed-8edf-8918d49c64de-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "82113c4e-4aec-4aed-8edf-8918d49c64de" (UID: "82113c4e-4aec-4aed-8edf-8918d49c64de"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.516953 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6gjz\" (UniqueName: \"kubernetes.io/projected/82113c4e-4aec-4aed-8edf-8918d49c64de-kube-api-access-j6gjz\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.516996 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82113c4e-4aec-4aed-8edf-8918d49c64de-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.517010 4565 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82113c4e-4aec-4aed-8edf-8918d49c64de-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.517023 4565 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82113c4e-4aec-4aed-8edf-8918d49c64de-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.517032 4565 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82113c4e-4aec-4aed-8edf-8918d49c64de-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.592855 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-757f65c548-72sgl"] Nov 25 09:19:31 crc kubenswrapper[4565]: W1125 09:19:31.600636 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod187eb1ab_d8ed_4506_bc95_59cb1f61e285.slice/crio-df6efea1099ee695bcf4a98ebb9ad3b3e00822d862a12f5ae1fdbbc5ae0d9c8d WatchSource:0}: Error finding container df6efea1099ee695bcf4a98ebb9ad3b3e00822d862a12f5ae1fdbbc5ae0d9c8d: Status 404 returned error can't find the 
container with id df6efea1099ee695bcf4a98ebb9ad3b3e00822d862a12f5ae1fdbbc5ae0d9c8d Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.657707 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-66b84f9bd8-pdssf"] Nov 25 09:19:31 crc kubenswrapper[4565]: W1125 09:19:31.661745 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d87b83a_02e7_48d5_8f87_e127fe8ffe0b.slice/crio-ec9336cbe1f19482be751654a238409eeffef81bed89598ac01e4249355dea46 WatchSource:0}: Error finding container ec9336cbe1f19482be751654a238409eeffef81bed89598ac01e4249355dea46: Status 404 returned error can't find the container with id ec9336cbe1f19482be751654a238409eeffef81bed89598ac01e4249355dea46 Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.916660 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" event={"ID":"82113c4e-4aec-4aed-8edf-8918d49c64de","Type":"ContainerDied","Data":"a958588604917f26db6b4079bfd97435a61ce1fe3153859bcd32fb537fd844f6"} Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.917007 4565 scope.go:117] "RemoveContainer" containerID="c6291dff42c38991e0c84554070fc41a1701223885d0eb0a69034b4b89e7f0fe" Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.917262 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66f4bdbdb7-p7zfn" Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.918441 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-757f65c548-72sgl" event={"ID":"187eb1ab-d8ed-4506-bc95-59cb1f61e285","Type":"ContainerStarted","Data":"0ee118eaf65c25cd011463db298ad9c7721c6fcec2c7f96a3b758050c811bea8"} Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.918503 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-757f65c548-72sgl" event={"ID":"187eb1ab-d8ed-4506-bc95-59cb1f61e285","Type":"ContainerStarted","Data":"df6efea1099ee695bcf4a98ebb9ad3b3e00822d862a12f5ae1fdbbc5ae0d9c8d"} Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.918665 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.928736 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc2a2ff-50de-4bb9-9581-3868db5ec59e","Type":"ContainerStarted","Data":"dca5cbb374d88e55d6dd3d463b50fe1e12445a46093c082e97d38ec2b414da62"} Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.931852 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-bs9qc" Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.932014 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bs9qc" event={"ID":"fad94c28-1e87-4955-9d65-ca6cfdd4087f","Type":"ContainerDied","Data":"2574258413dbde769160c74d33162c844803ce5ce095998dc3012e830d92faf4"} Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.932196 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2574258413dbde769160c74d33162c844803ce5ce095998dc3012e830d92faf4" Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.950568 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-66b84f9bd8-pdssf" event={"ID":"7d87b83a-02e7-48d5-8f87-e127fe8ffe0b","Type":"ContainerStarted","Data":"717ff24d9c59f2b0190301124b0c3eff51ece8935429a9b6d496db0474460fd7"} Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.950641 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-66b84f9bd8-pdssf" event={"ID":"7d87b83a-02e7-48d5-8f87-e127fe8ffe0b","Type":"ContainerStarted","Data":"ec9336cbe1f19482be751654a238409eeffef81bed89598ac01e4249355dea46"} Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.961916 4565 scope.go:117] "RemoveContainer" containerID="7ba29dbbe591c43f090eb6e1e2fb9f59fc83f7e19bbdb0720daaa60758cbe31d" Nov 25 09:19:31 crc kubenswrapper[4565]: I1125 09:19:31.980986 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-757f65c548-72sgl" podStartSLOduration=2.980966731 podStartE2EDuration="2.980966731s" podCreationTimestamp="2025-11-25 09:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:19:31.939339839 +0000 UTC m=+905.141834976" watchObservedRunningTime="2025-11-25 09:19:31.980966731 +0000 UTC m=+905.183461869" Nov 25 09:19:32 crc kubenswrapper[4565]: 
I1125 09:19:32.016412 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66f4bdbdb7-p7zfn"] Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.022041 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66f4bdbdb7-p7zfn"] Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.357874 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-58b56d8886-ghgt5"] Nov 25 09:19:32 crc kubenswrapper[4565]: E1125 09:19:32.358449 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82113c4e-4aec-4aed-8edf-8918d49c64de" containerName="init" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.358463 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="82113c4e-4aec-4aed-8edf-8918d49c64de" containerName="init" Nov 25 09:19:32 crc kubenswrapper[4565]: E1125 09:19:32.358476 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82113c4e-4aec-4aed-8edf-8918d49c64de" containerName="dnsmasq-dns" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.358481 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="82113c4e-4aec-4aed-8edf-8918d49c64de" containerName="dnsmasq-dns" Nov 25 09:19:32 crc kubenswrapper[4565]: E1125 09:19:32.358492 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fad94c28-1e87-4955-9d65-ca6cfdd4087f" containerName="barbican-db-sync" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.358497 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="fad94c28-1e87-4955-9d65-ca6cfdd4087f" containerName="barbican-db-sync" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.358634 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="82113c4e-4aec-4aed-8edf-8918d49c64de" containerName="dnsmasq-dns" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.358677 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="fad94c28-1e87-4955-9d65-ca6cfdd4087f" 
containerName="barbican-db-sync" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.359396 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-58b56d8886-ghgt5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.362169 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-kgbtq" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.366780 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.366814 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.374590 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-58b56d8886-ghgt5"] Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.438363 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-844b557b9c-554f5"] Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.451920 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-66898497b9-mtk55"] Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.452044 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-844b557b9c-554f5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.454455 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-66898497b9-mtk55" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.459878 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.479960 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-844b557b9c-554f5"] Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.498714 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-66898497b9-mtk55"] Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.541805 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3377fa1-4b60-4b8c-9efb-a266d872af91-config-data-custom\") pod \"barbican-keystone-listener-58b56d8886-ghgt5\" (UID: \"b3377fa1-4b60-4b8c-9efb-a266d872af91\") " pod="openstack/barbican-keystone-listener-58b56d8886-ghgt5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.541942 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5phh2\" (UniqueName: \"kubernetes.io/projected/b3377fa1-4b60-4b8c-9efb-a266d872af91-kube-api-access-5phh2\") pod \"barbican-keystone-listener-58b56d8886-ghgt5\" (UID: \"b3377fa1-4b60-4b8c-9efb-a266d872af91\") " pod="openstack/barbican-keystone-listener-58b56d8886-ghgt5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.541998 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3377fa1-4b60-4b8c-9efb-a266d872af91-combined-ca-bundle\") pod \"barbican-keystone-listener-58b56d8886-ghgt5\" (UID: \"b3377fa1-4b60-4b8c-9efb-a266d872af91\") " pod="openstack/barbican-keystone-listener-58b56d8886-ghgt5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.542125 4565 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3377fa1-4b60-4b8c-9efb-a266d872af91-config-data\") pod \"barbican-keystone-listener-58b56d8886-ghgt5\" (UID: \"b3377fa1-4b60-4b8c-9efb-a266d872af91\") " pod="openstack/barbican-keystone-listener-58b56d8886-ghgt5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.548025 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3377fa1-4b60-4b8c-9efb-a266d872af91-logs\") pod \"barbican-keystone-listener-58b56d8886-ghgt5\" (UID: \"b3377fa1-4b60-4b8c-9efb-a266d872af91\") " pod="openstack/barbican-keystone-listener-58b56d8886-ghgt5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.609312 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-597dbd885b-gqpd6"] Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.610725 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-597dbd885b-gqpd6" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.618135 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.633227 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-597dbd885b-gqpd6"] Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.650127 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ff2dc5b-9c32-43bc-b8c7-7341812d4160-logs\") pod \"barbican-worker-66898497b9-mtk55\" (UID: \"5ff2dc5b-9c32-43bc-b8c7-7341812d4160\") " pod="openstack/barbican-worker-66898497b9-mtk55" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.650181 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1403724f-a8aa-495c-9c1d-1e7a3eccb889-logs\") pod \"barbican-api-597dbd885b-gqpd6\" (UID: \"1403724f-a8aa-495c-9c1d-1e7a3eccb889\") " pod="openstack/barbican-api-597dbd885b-gqpd6" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.650214 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/578e3cd4-5b76-4685-ad96-81427aff9bca-ovsdbserver-nb\") pod \"dnsmasq-dns-844b557b9c-554f5\" (UID: \"578e3cd4-5b76-4685-ad96-81427aff9bca\") " pod="openstack/dnsmasq-dns-844b557b9c-554f5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.650254 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1403724f-a8aa-495c-9c1d-1e7a3eccb889-config-data-custom\") pod \"barbican-api-597dbd885b-gqpd6\" (UID: \"1403724f-a8aa-495c-9c1d-1e7a3eccb889\") " 
pod="openstack/barbican-api-597dbd885b-gqpd6" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.650309 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nfmw\" (UniqueName: \"kubernetes.io/projected/5ff2dc5b-9c32-43bc-b8c7-7341812d4160-kube-api-access-4nfmw\") pod \"barbican-worker-66898497b9-mtk55\" (UID: \"5ff2dc5b-9c32-43bc-b8c7-7341812d4160\") " pod="openstack/barbican-worker-66898497b9-mtk55" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.650388 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5ff2dc5b-9c32-43bc-b8c7-7341812d4160-config-data-custom\") pod \"barbican-worker-66898497b9-mtk55\" (UID: \"5ff2dc5b-9c32-43bc-b8c7-7341812d4160\") " pod="openstack/barbican-worker-66898497b9-mtk55" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.650426 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ff2dc5b-9c32-43bc-b8c7-7341812d4160-combined-ca-bundle\") pod \"barbican-worker-66898497b9-mtk55\" (UID: \"5ff2dc5b-9c32-43bc-b8c7-7341812d4160\") " pod="openstack/barbican-worker-66898497b9-mtk55" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.650473 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/578e3cd4-5b76-4685-ad96-81427aff9bca-dns-svc\") pod \"dnsmasq-dns-844b557b9c-554f5\" (UID: \"578e3cd4-5b76-4685-ad96-81427aff9bca\") " pod="openstack/dnsmasq-dns-844b557b9c-554f5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.650516 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5phh2\" (UniqueName: \"kubernetes.io/projected/b3377fa1-4b60-4b8c-9efb-a266d872af91-kube-api-access-5phh2\") pod 
\"barbican-keystone-listener-58b56d8886-ghgt5\" (UID: \"b3377fa1-4b60-4b8c-9efb-a266d872af91\") " pod="openstack/barbican-keystone-listener-58b56d8886-ghgt5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.650591 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3377fa1-4b60-4b8c-9efb-a266d872af91-combined-ca-bundle\") pod \"barbican-keystone-listener-58b56d8886-ghgt5\" (UID: \"b3377fa1-4b60-4b8c-9efb-a266d872af91\") " pod="openstack/barbican-keystone-listener-58b56d8886-ghgt5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.650618 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/578e3cd4-5b76-4685-ad96-81427aff9bca-ovsdbserver-sb\") pod \"dnsmasq-dns-844b557b9c-554f5\" (UID: \"578e3cd4-5b76-4685-ad96-81427aff9bca\") " pod="openstack/dnsmasq-dns-844b557b9c-554f5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.650732 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfh78\" (UniqueName: \"kubernetes.io/projected/578e3cd4-5b76-4685-ad96-81427aff9bca-kube-api-access-pfh78\") pod \"dnsmasq-dns-844b557b9c-554f5\" (UID: \"578e3cd4-5b76-4685-ad96-81427aff9bca\") " pod="openstack/dnsmasq-dns-844b557b9c-554f5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.650771 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd7v4\" (UniqueName: \"kubernetes.io/projected/1403724f-a8aa-495c-9c1d-1e7a3eccb889-kube-api-access-sd7v4\") pod \"barbican-api-597dbd885b-gqpd6\" (UID: \"1403724f-a8aa-495c-9c1d-1e7a3eccb889\") " pod="openstack/barbican-api-597dbd885b-gqpd6" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.650832 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1403724f-a8aa-495c-9c1d-1e7a3eccb889-combined-ca-bundle\") pod \"barbican-api-597dbd885b-gqpd6\" (UID: \"1403724f-a8aa-495c-9c1d-1e7a3eccb889\") " pod="openstack/barbican-api-597dbd885b-gqpd6" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.650965 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3377fa1-4b60-4b8c-9efb-a266d872af91-config-data\") pod \"barbican-keystone-listener-58b56d8886-ghgt5\" (UID: \"b3377fa1-4b60-4b8c-9efb-a266d872af91\") " pod="openstack/barbican-keystone-listener-58b56d8886-ghgt5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.651070 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3377fa1-4b60-4b8c-9efb-a266d872af91-logs\") pod \"barbican-keystone-listener-58b56d8886-ghgt5\" (UID: \"b3377fa1-4b60-4b8c-9efb-a266d872af91\") " pod="openstack/barbican-keystone-listener-58b56d8886-ghgt5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.651099 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/578e3cd4-5b76-4685-ad96-81427aff9bca-config\") pod \"dnsmasq-dns-844b557b9c-554f5\" (UID: \"578e3cd4-5b76-4685-ad96-81427aff9bca\") " pod="openstack/dnsmasq-dns-844b557b9c-554f5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.651185 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ff2dc5b-9c32-43bc-b8c7-7341812d4160-config-data\") pod \"barbican-worker-66898497b9-mtk55\" (UID: \"5ff2dc5b-9c32-43bc-b8c7-7341812d4160\") " pod="openstack/barbican-worker-66898497b9-mtk55" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.651212 4565 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1403724f-a8aa-495c-9c1d-1e7a3eccb889-config-data\") pod \"barbican-api-597dbd885b-gqpd6\" (UID: \"1403724f-a8aa-495c-9c1d-1e7a3eccb889\") " pod="openstack/barbican-api-597dbd885b-gqpd6" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.651357 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3377fa1-4b60-4b8c-9efb-a266d872af91-config-data-custom\") pod \"barbican-keystone-listener-58b56d8886-ghgt5\" (UID: \"b3377fa1-4b60-4b8c-9efb-a266d872af91\") " pod="openstack/barbican-keystone-listener-58b56d8886-ghgt5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.651452 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3377fa1-4b60-4b8c-9efb-a266d872af91-logs\") pod \"barbican-keystone-listener-58b56d8886-ghgt5\" (UID: \"b3377fa1-4b60-4b8c-9efb-a266d872af91\") " pod="openstack/barbican-keystone-listener-58b56d8886-ghgt5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.656434 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3377fa1-4b60-4b8c-9efb-a266d872af91-config-data\") pod \"barbican-keystone-listener-58b56d8886-ghgt5\" (UID: \"b3377fa1-4b60-4b8c-9efb-a266d872af91\") " pod="openstack/barbican-keystone-listener-58b56d8886-ghgt5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.665982 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3377fa1-4b60-4b8c-9efb-a266d872af91-config-data-custom\") pod \"barbican-keystone-listener-58b56d8886-ghgt5\" (UID: \"b3377fa1-4b60-4b8c-9efb-a266d872af91\") " pod="openstack/barbican-keystone-listener-58b56d8886-ghgt5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 
09:19:32.671281 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3377fa1-4b60-4b8c-9efb-a266d872af91-combined-ca-bundle\") pod \"barbican-keystone-listener-58b56d8886-ghgt5\" (UID: \"b3377fa1-4b60-4b8c-9efb-a266d872af91\") " pod="openstack/barbican-keystone-listener-58b56d8886-ghgt5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.675801 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5phh2\" (UniqueName: \"kubernetes.io/projected/b3377fa1-4b60-4b8c-9efb-a266d872af91-kube-api-access-5phh2\") pod \"barbican-keystone-listener-58b56d8886-ghgt5\" (UID: \"b3377fa1-4b60-4b8c-9efb-a266d872af91\") " pod="openstack/barbican-keystone-listener-58b56d8886-ghgt5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.752822 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5ff2dc5b-9c32-43bc-b8c7-7341812d4160-config-data-custom\") pod \"barbican-worker-66898497b9-mtk55\" (UID: \"5ff2dc5b-9c32-43bc-b8c7-7341812d4160\") " pod="openstack/barbican-worker-66898497b9-mtk55" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.752873 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ff2dc5b-9c32-43bc-b8c7-7341812d4160-combined-ca-bundle\") pod \"barbican-worker-66898497b9-mtk55\" (UID: \"5ff2dc5b-9c32-43bc-b8c7-7341812d4160\") " pod="openstack/barbican-worker-66898497b9-mtk55" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.752911 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/578e3cd4-5b76-4685-ad96-81427aff9bca-dns-svc\") pod \"dnsmasq-dns-844b557b9c-554f5\" (UID: \"578e3cd4-5b76-4685-ad96-81427aff9bca\") " pod="openstack/dnsmasq-dns-844b557b9c-554f5" Nov 25 09:19:32 crc 
kubenswrapper[4565]: I1125 09:19:32.753000 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/578e3cd4-5b76-4685-ad96-81427aff9bca-ovsdbserver-sb\") pod \"dnsmasq-dns-844b557b9c-554f5\" (UID: \"578e3cd4-5b76-4685-ad96-81427aff9bca\") " pod="openstack/dnsmasq-dns-844b557b9c-554f5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.754018 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/578e3cd4-5b76-4685-ad96-81427aff9bca-ovsdbserver-sb\") pod \"dnsmasq-dns-844b557b9c-554f5\" (UID: \"578e3cd4-5b76-4685-ad96-81427aff9bca\") " pod="openstack/dnsmasq-dns-844b557b9c-554f5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.754086 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/578e3cd4-5b76-4685-ad96-81427aff9bca-dns-svc\") pod \"dnsmasq-dns-844b557b9c-554f5\" (UID: \"578e3cd4-5b76-4685-ad96-81427aff9bca\") " pod="openstack/dnsmasq-dns-844b557b9c-554f5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.754292 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfh78\" (UniqueName: \"kubernetes.io/projected/578e3cd4-5b76-4685-ad96-81427aff9bca-kube-api-access-pfh78\") pod \"dnsmasq-dns-844b557b9c-554f5\" (UID: \"578e3cd4-5b76-4685-ad96-81427aff9bca\") " pod="openstack/dnsmasq-dns-844b557b9c-554f5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.754382 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd7v4\" (UniqueName: \"kubernetes.io/projected/1403724f-a8aa-495c-9c1d-1e7a3eccb889-kube-api-access-sd7v4\") pod \"barbican-api-597dbd885b-gqpd6\" (UID: \"1403724f-a8aa-495c-9c1d-1e7a3eccb889\") " pod="openstack/barbican-api-597dbd885b-gqpd6" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.754486 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1403724f-a8aa-495c-9c1d-1e7a3eccb889-combined-ca-bundle\") pod \"barbican-api-597dbd885b-gqpd6\" (UID: \"1403724f-a8aa-495c-9c1d-1e7a3eccb889\") " pod="openstack/barbican-api-597dbd885b-gqpd6" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.754666 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/578e3cd4-5b76-4685-ad96-81427aff9bca-config\") pod \"dnsmasq-dns-844b557b9c-554f5\" (UID: \"578e3cd4-5b76-4685-ad96-81427aff9bca\") " pod="openstack/dnsmasq-dns-844b557b9c-554f5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.754788 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ff2dc5b-9c32-43bc-b8c7-7341812d4160-config-data\") pod \"barbican-worker-66898497b9-mtk55\" (UID: \"5ff2dc5b-9c32-43bc-b8c7-7341812d4160\") " pod="openstack/barbican-worker-66898497b9-mtk55" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.754821 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1403724f-a8aa-495c-9c1d-1e7a3eccb889-config-data\") pod \"barbican-api-597dbd885b-gqpd6\" (UID: \"1403724f-a8aa-495c-9c1d-1e7a3eccb889\") " pod="openstack/barbican-api-597dbd885b-gqpd6" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.754988 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ff2dc5b-9c32-43bc-b8c7-7341812d4160-logs\") pod \"barbican-worker-66898497b9-mtk55\" (UID: \"5ff2dc5b-9c32-43bc-b8c7-7341812d4160\") " pod="openstack/barbican-worker-66898497b9-mtk55" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.755022 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/1403724f-a8aa-495c-9c1d-1e7a3eccb889-logs\") pod \"barbican-api-597dbd885b-gqpd6\" (UID: \"1403724f-a8aa-495c-9c1d-1e7a3eccb889\") " pod="openstack/barbican-api-597dbd885b-gqpd6" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.755055 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/578e3cd4-5b76-4685-ad96-81427aff9bca-ovsdbserver-nb\") pod \"dnsmasq-dns-844b557b9c-554f5\" (UID: \"578e3cd4-5b76-4685-ad96-81427aff9bca\") " pod="openstack/dnsmasq-dns-844b557b9c-554f5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.755099 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1403724f-a8aa-495c-9c1d-1e7a3eccb889-config-data-custom\") pod \"barbican-api-597dbd885b-gqpd6\" (UID: \"1403724f-a8aa-495c-9c1d-1e7a3eccb889\") " pod="openstack/barbican-api-597dbd885b-gqpd6" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.755123 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nfmw\" (UniqueName: \"kubernetes.io/projected/5ff2dc5b-9c32-43bc-b8c7-7341812d4160-kube-api-access-4nfmw\") pod \"barbican-worker-66898497b9-mtk55\" (UID: \"5ff2dc5b-9c32-43bc-b8c7-7341812d4160\") " pod="openstack/barbican-worker-66898497b9-mtk55" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.756483 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/578e3cd4-5b76-4685-ad96-81427aff9bca-config\") pod \"dnsmasq-dns-844b557b9c-554f5\" (UID: \"578e3cd4-5b76-4685-ad96-81427aff9bca\") " pod="openstack/dnsmasq-dns-844b557b9c-554f5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.760754 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1403724f-a8aa-495c-9c1d-1e7a3eccb889-logs\") pod \"barbican-api-597dbd885b-gqpd6\" (UID: \"1403724f-a8aa-495c-9c1d-1e7a3eccb889\") " pod="openstack/barbican-api-597dbd885b-gqpd6" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.761827 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/578e3cd4-5b76-4685-ad96-81427aff9bca-ovsdbserver-nb\") pod \"dnsmasq-dns-844b557b9c-554f5\" (UID: \"578e3cd4-5b76-4685-ad96-81427aff9bca\") " pod="openstack/dnsmasq-dns-844b557b9c-554f5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.762347 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ff2dc5b-9c32-43bc-b8c7-7341812d4160-logs\") pod \"barbican-worker-66898497b9-mtk55\" (UID: \"5ff2dc5b-9c32-43bc-b8c7-7341812d4160\") " pod="openstack/barbican-worker-66898497b9-mtk55" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.770657 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5ff2dc5b-9c32-43bc-b8c7-7341812d4160-config-data-custom\") pod \"barbican-worker-66898497b9-mtk55\" (UID: \"5ff2dc5b-9c32-43bc-b8c7-7341812d4160\") " pod="openstack/barbican-worker-66898497b9-mtk55" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.773836 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ff2dc5b-9c32-43bc-b8c7-7341812d4160-combined-ca-bundle\") pod \"barbican-worker-66898497b9-mtk55\" (UID: \"5ff2dc5b-9c32-43bc-b8c7-7341812d4160\") " pod="openstack/barbican-worker-66898497b9-mtk55" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.781428 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1403724f-a8aa-495c-9c1d-1e7a3eccb889-config-data-custom\") 
pod \"barbican-api-597dbd885b-gqpd6\" (UID: \"1403724f-a8aa-495c-9c1d-1e7a3eccb889\") " pod="openstack/barbican-api-597dbd885b-gqpd6" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.782381 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfh78\" (UniqueName: \"kubernetes.io/projected/578e3cd4-5b76-4685-ad96-81427aff9bca-kube-api-access-pfh78\") pod \"dnsmasq-dns-844b557b9c-554f5\" (UID: \"578e3cd4-5b76-4685-ad96-81427aff9bca\") " pod="openstack/dnsmasq-dns-844b557b9c-554f5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.783396 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1403724f-a8aa-495c-9c1d-1e7a3eccb889-config-data\") pod \"barbican-api-597dbd885b-gqpd6\" (UID: \"1403724f-a8aa-495c-9c1d-1e7a3eccb889\") " pod="openstack/barbican-api-597dbd885b-gqpd6" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.784823 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1403724f-a8aa-495c-9c1d-1e7a3eccb889-combined-ca-bundle\") pod \"barbican-api-597dbd885b-gqpd6\" (UID: \"1403724f-a8aa-495c-9c1d-1e7a3eccb889\") " pod="openstack/barbican-api-597dbd885b-gqpd6" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.785313 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nfmw\" (UniqueName: \"kubernetes.io/projected/5ff2dc5b-9c32-43bc-b8c7-7341812d4160-kube-api-access-4nfmw\") pod \"barbican-worker-66898497b9-mtk55\" (UID: \"5ff2dc5b-9c32-43bc-b8c7-7341812d4160\") " pod="openstack/barbican-worker-66898497b9-mtk55" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.792093 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ff2dc5b-9c32-43bc-b8c7-7341812d4160-config-data\") pod \"barbican-worker-66898497b9-mtk55\" (UID: 
\"5ff2dc5b-9c32-43bc-b8c7-7341812d4160\") " pod="openstack/barbican-worker-66898497b9-mtk55" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.793404 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd7v4\" (UniqueName: \"kubernetes.io/projected/1403724f-a8aa-495c-9c1d-1e7a3eccb889-kube-api-access-sd7v4\") pod \"barbican-api-597dbd885b-gqpd6\" (UID: \"1403724f-a8aa-495c-9c1d-1e7a3eccb889\") " pod="openstack/barbican-api-597dbd885b-gqpd6" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.807083 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-844b557b9c-554f5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.825513 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-66898497b9-mtk55" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.934015 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-597dbd885b-gqpd6" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.975095 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-58b56d8886-ghgt5" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.990350 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-66b84f9bd8-pdssf" event={"ID":"7d87b83a-02e7-48d5-8f87-e127fe8ffe0b","Type":"ContainerStarted","Data":"44541fd67c12cc5d2d2ea01447f5e1f0531f35071897a62ea72b6f1082e551da"} Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.990403 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:32 crc kubenswrapper[4565]: I1125 09:19:32.990429 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:33 crc kubenswrapper[4565]: I1125 09:19:33.060971 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-66b84f9bd8-pdssf" podStartSLOduration=7.060951165 podStartE2EDuration="7.060951165s" podCreationTimestamp="2025-11-25 09:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:19:33.03141802 +0000 UTC m=+906.233913158" watchObservedRunningTime="2025-11-25 09:19:33.060951165 +0000 UTC m=+906.263446304" Nov 25 09:19:33 crc kubenswrapper[4565]: I1125 09:19:33.126680 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82113c4e-4aec-4aed-8edf-8918d49c64de" path="/var/lib/kubelet/pods/82113c4e-4aec-4aed-8edf-8918d49c64de/volumes" Nov 25 09:19:33 crc kubenswrapper[4565]: I1125 09:19:33.361523 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-844b557b9c-554f5"] Nov 25 09:19:33 crc kubenswrapper[4565]: I1125 09:19:33.379650 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-66898497b9-mtk55"] Nov 25 09:19:33 crc kubenswrapper[4565]: W1125 09:19:33.400872 4565 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ff2dc5b_9c32_43bc_b8c7_7341812d4160.slice/crio-e4f1fbce4d878b76423a8acc83b31d74d65ce987b42ee785987eecc597c7749d WatchSource:0}: Error finding container e4f1fbce4d878b76423a8acc83b31d74d65ce987b42ee785987eecc597c7749d: Status 404 returned error can't find the container with id e4f1fbce4d878b76423a8acc83b31d74d65ce987b42ee785987eecc597c7749d Nov 25 09:19:33 crc kubenswrapper[4565]: I1125 09:19:33.659170 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-58b56d8886-ghgt5"] Nov 25 09:19:33 crc kubenswrapper[4565]: I1125 09:19:33.720039 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-597dbd885b-gqpd6"] Nov 25 09:19:33 crc kubenswrapper[4565]: W1125 09:19:33.735631 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1403724f_a8aa_495c_9c1d_1e7a3eccb889.slice/crio-d2929054be334019c0ac10900517f1c395d0e5064e4f6248e6c9c2c84495acd9 WatchSource:0}: Error finding container d2929054be334019c0ac10900517f1c395d0e5064e4f6248e6c9c2c84495acd9: Status 404 returned error can't find the container with id d2929054be334019c0ac10900517f1c395d0e5064e4f6248e6c9c2c84495acd9 Nov 25 09:19:33 crc kubenswrapper[4565]: I1125 09:19:33.998665 4565 generic.go:334] "Generic (PLEG): container finished" podID="578e3cd4-5b76-4685-ad96-81427aff9bca" containerID="64290a60338d32e576ac585404b1e30432b1127a09f36bcab6c153dd64079490" exitCode=0 Nov 25 09:19:33 crc kubenswrapper[4565]: I1125 09:19:33.998781 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844b557b9c-554f5" event={"ID":"578e3cd4-5b76-4685-ad96-81427aff9bca","Type":"ContainerDied","Data":"64290a60338d32e576ac585404b1e30432b1127a09f36bcab6c153dd64079490"} Nov 25 09:19:33 crc kubenswrapper[4565]: I1125 09:19:33.998817 4565 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-844b557b9c-554f5" event={"ID":"578e3cd4-5b76-4685-ad96-81427aff9bca","Type":"ContainerStarted","Data":"19157cc860a6625999a631401bbea898f9ef6db4c06d969931e6abe0d243d144"} Nov 25 09:19:34 crc kubenswrapper[4565]: I1125 09:19:34.014202 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-597dbd885b-gqpd6" event={"ID":"1403724f-a8aa-495c-9c1d-1e7a3eccb889","Type":"ContainerStarted","Data":"59d56bec96647d30edc726f1e42a90066ba80f0c304b540c9e72960bc9dac743"} Nov 25 09:19:34 crc kubenswrapper[4565]: I1125 09:19:34.014267 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-597dbd885b-gqpd6" event={"ID":"1403724f-a8aa-495c-9c1d-1e7a3eccb889","Type":"ContainerStarted","Data":"d2929054be334019c0ac10900517f1c395d0e5064e4f6248e6c9c2c84495acd9"} Nov 25 09:19:34 crc kubenswrapper[4565]: I1125 09:19:34.034497 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-58b56d8886-ghgt5" event={"ID":"b3377fa1-4b60-4b8c-9efb-a266d872af91","Type":"ContainerStarted","Data":"661a9184a1088b43eb724f6e9ca47d58fad7a0f14497b574e771d1218c19cbec"} Nov 25 09:19:34 crc kubenswrapper[4565]: I1125 09:19:34.039158 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-66898497b9-mtk55" event={"ID":"5ff2dc5b-9c32-43bc-b8c7-7341812d4160","Type":"ContainerStarted","Data":"e4f1fbce4d878b76423a8acc83b31d74d65ce987b42ee785987eecc597c7749d"} Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.047619 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844b557b9c-554f5" event={"ID":"578e3cd4-5b76-4685-ad96-81427aff9bca","Type":"ContainerStarted","Data":"558bfa667fe8afb5c253572686950fce1d3220bf2ccd8d52dead554f3ab0d74c"} Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.047708 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-844b557b9c-554f5" Nov 25 09:19:35 crc 
kubenswrapper[4565]: I1125 09:19:35.051423 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-597dbd885b-gqpd6" event={"ID":"1403724f-a8aa-495c-9c1d-1e7a3eccb889","Type":"ContainerStarted","Data":"baff7536cf37610b159060ba529918b8cb17a84baeb0158d01623c8b9021d26d"} Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.051632 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-597dbd885b-gqpd6" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.071306 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-844b557b9c-554f5" podStartSLOduration=3.071285857 podStartE2EDuration="3.071285857s" podCreationTimestamp="2025-11-25 09:19:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:19:35.067598656 +0000 UTC m=+908.270093794" watchObservedRunningTime="2025-11-25 09:19:35.071285857 +0000 UTC m=+908.273780995" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.090340 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-597dbd885b-gqpd6" podStartSLOduration=3.090234829 podStartE2EDuration="3.090234829s" podCreationTimestamp="2025-11-25 09:19:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:19:35.088223647 +0000 UTC m=+908.290718776" watchObservedRunningTime="2025-11-25 09:19:35.090234829 +0000 UTC m=+908.292729967" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.395790 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6b4c469f64-4jfxz"] Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.397210 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.399047 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.399448 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.416517 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b4c469f64-4jfxz"] Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.446190 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b577976e-309e-47cd-80a6-4f72547d912b-internal-tls-certs\") pod \"barbican-api-6b4c469f64-4jfxz\" (UID: \"b577976e-309e-47cd-80a6-4f72547d912b\") " pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.446592 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b577976e-309e-47cd-80a6-4f72547d912b-logs\") pod \"barbican-api-6b4c469f64-4jfxz\" (UID: \"b577976e-309e-47cd-80a6-4f72547d912b\") " pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.446628 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b577976e-309e-47cd-80a6-4f72547d912b-combined-ca-bundle\") pod \"barbican-api-6b4c469f64-4jfxz\" (UID: \"b577976e-309e-47cd-80a6-4f72547d912b\") " pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.446658 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/b577976e-309e-47cd-80a6-4f72547d912b-config-data-custom\") pod \"barbican-api-6b4c469f64-4jfxz\" (UID: \"b577976e-309e-47cd-80a6-4f72547d912b\") " pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.446707 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b577976e-309e-47cd-80a6-4f72547d912b-config-data\") pod \"barbican-api-6b4c469f64-4jfxz\" (UID: \"b577976e-309e-47cd-80a6-4f72547d912b\") " pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.446779 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfv99\" (UniqueName: \"kubernetes.io/projected/b577976e-309e-47cd-80a6-4f72547d912b-kube-api-access-bfv99\") pod \"barbican-api-6b4c469f64-4jfxz\" (UID: \"b577976e-309e-47cd-80a6-4f72547d912b\") " pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.446855 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b577976e-309e-47cd-80a6-4f72547d912b-public-tls-certs\") pod \"barbican-api-6b4c469f64-4jfxz\" (UID: \"b577976e-309e-47cd-80a6-4f72547d912b\") " pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.548330 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b577976e-309e-47cd-80a6-4f72547d912b-public-tls-certs\") pod \"barbican-api-6b4c469f64-4jfxz\" (UID: \"b577976e-309e-47cd-80a6-4f72547d912b\") " pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.548424 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b577976e-309e-47cd-80a6-4f72547d912b-internal-tls-certs\") pod \"barbican-api-6b4c469f64-4jfxz\" (UID: \"b577976e-309e-47cd-80a6-4f72547d912b\") " pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.548498 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b577976e-309e-47cd-80a6-4f72547d912b-logs\") pod \"barbican-api-6b4c469f64-4jfxz\" (UID: \"b577976e-309e-47cd-80a6-4f72547d912b\") " pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.548523 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b577976e-309e-47cd-80a6-4f72547d912b-combined-ca-bundle\") pod \"barbican-api-6b4c469f64-4jfxz\" (UID: \"b577976e-309e-47cd-80a6-4f72547d912b\") " pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.548540 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b577976e-309e-47cd-80a6-4f72547d912b-config-data-custom\") pod \"barbican-api-6b4c469f64-4jfxz\" (UID: \"b577976e-309e-47cd-80a6-4f72547d912b\") " pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.548571 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b577976e-309e-47cd-80a6-4f72547d912b-config-data\") pod \"barbican-api-6b4c469f64-4jfxz\" (UID: \"b577976e-309e-47cd-80a6-4f72547d912b\") " pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.548594 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfv99\" (UniqueName: 
\"kubernetes.io/projected/b577976e-309e-47cd-80a6-4f72547d912b-kube-api-access-bfv99\") pod \"barbican-api-6b4c469f64-4jfxz\" (UID: \"b577976e-309e-47cd-80a6-4f72547d912b\") " pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.549199 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b577976e-309e-47cd-80a6-4f72547d912b-logs\") pod \"barbican-api-6b4c469f64-4jfxz\" (UID: \"b577976e-309e-47cd-80a6-4f72547d912b\") " pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.555780 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b577976e-309e-47cd-80a6-4f72547d912b-combined-ca-bundle\") pod \"barbican-api-6b4c469f64-4jfxz\" (UID: \"b577976e-309e-47cd-80a6-4f72547d912b\") " pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.555821 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b577976e-309e-47cd-80a6-4f72547d912b-config-data-custom\") pod \"barbican-api-6b4c469f64-4jfxz\" (UID: \"b577976e-309e-47cd-80a6-4f72547d912b\") " pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.556286 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b577976e-309e-47cd-80a6-4f72547d912b-config-data\") pod \"barbican-api-6b4c469f64-4jfxz\" (UID: \"b577976e-309e-47cd-80a6-4f72547d912b\") " pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.557440 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b577976e-309e-47cd-80a6-4f72547d912b-public-tls-certs\") pod 
\"barbican-api-6b4c469f64-4jfxz\" (UID: \"b577976e-309e-47cd-80a6-4f72547d912b\") " pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.557589 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b577976e-309e-47cd-80a6-4f72547d912b-internal-tls-certs\") pod \"barbican-api-6b4c469f64-4jfxz\" (UID: \"b577976e-309e-47cd-80a6-4f72547d912b\") " pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.565283 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfv99\" (UniqueName: \"kubernetes.io/projected/b577976e-309e-47cd-80a6-4f72547d912b-kube-api-access-bfv99\") pod \"barbican-api-6b4c469f64-4jfxz\" (UID: \"b577976e-309e-47cd-80a6-4f72547d912b\") " pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:35 crc kubenswrapper[4565]: I1125 09:19:35.717941 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:36 crc kubenswrapper[4565]: I1125 09:19:36.096292 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-58b56d8886-ghgt5" event={"ID":"b3377fa1-4b60-4b8c-9efb-a266d872af91","Type":"ContainerStarted","Data":"5f27b692480de70c5adf4087379e79a5a008b60d1407fcf85592670210403f2d"} Nov 25 09:19:36 crc kubenswrapper[4565]: I1125 09:19:36.096583 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-58b56d8886-ghgt5" event={"ID":"b3377fa1-4b60-4b8c-9efb-a266d872af91","Type":"ContainerStarted","Data":"0d3e391955b5a64619d950b0bd1d0c74406e001238907be55e93b9868bef3a8b"} Nov 25 09:19:36 crc kubenswrapper[4565]: I1125 09:19:36.105416 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-66898497b9-mtk55" event={"ID":"5ff2dc5b-9c32-43bc-b8c7-7341812d4160","Type":"ContainerStarted","Data":"cbff8edce60e652479475df0091d06596f32f998c2ff4a35f1f7320f7fb73d4e"} Nov 25 09:19:36 crc kubenswrapper[4565]: I1125 09:19:36.105444 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-66898497b9-mtk55" event={"ID":"5ff2dc5b-9c32-43bc-b8c7-7341812d4160","Type":"ContainerStarted","Data":"c1cb3dd4754680e8ea907e8668ba8d2a9ecaba380f7d8f4c5f1d3854ad0aaf11"} Nov 25 09:19:36 crc kubenswrapper[4565]: I1125 09:19:36.105824 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-597dbd885b-gqpd6" Nov 25 09:19:36 crc kubenswrapper[4565]: I1125 09:19:36.123178 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-58b56d8886-ghgt5" podStartSLOduration=2.427985627 podStartE2EDuration="4.123165158s" podCreationTimestamp="2025-11-25 09:19:32 +0000 UTC" firstStartedPulling="2025-11-25 09:19:33.664009213 +0000 UTC m=+906.866504350" lastFinishedPulling="2025-11-25 09:19:35.359188753 +0000 
UTC m=+908.561683881" observedRunningTime="2025-11-25 09:19:36.116382522 +0000 UTC m=+909.318877660" watchObservedRunningTime="2025-11-25 09:19:36.123165158 +0000 UTC m=+909.325660296" Nov 25 09:19:36 crc kubenswrapper[4565]: I1125 09:19:36.141406 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-66898497b9-mtk55" podStartSLOduration=2.187029653 podStartE2EDuration="4.141385676s" podCreationTimestamp="2025-11-25 09:19:32 +0000 UTC" firstStartedPulling="2025-11-25 09:19:33.402782686 +0000 UTC m=+906.605277824" lastFinishedPulling="2025-11-25 09:19:35.357138709 +0000 UTC m=+908.559633847" observedRunningTime="2025-11-25 09:19:36.132024829 +0000 UTC m=+909.334519967" watchObservedRunningTime="2025-11-25 09:19:36.141385676 +0000 UTC m=+909.343880815" Nov 25 09:19:41 crc kubenswrapper[4565]: I1125 09:19:41.047973 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b4c469f64-4jfxz"] Nov 25 09:19:41 crc kubenswrapper[4565]: W1125 09:19:41.058464 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb577976e_309e_47cd_80a6_4f72547d912b.slice/crio-4906b07418c3b53e2a86f6530188026d41d861791c1576190cd6c339f264e920 WatchSource:0}: Error finding container 4906b07418c3b53e2a86f6530188026d41d861791c1576190cd6c339f264e920: Status 404 returned error can't find the container with id 4906b07418c3b53e2a86f6530188026d41d861791c1576190cd6c339f264e920 Nov 25 09:19:41 crc kubenswrapper[4565]: I1125 09:19:41.180490 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc2a2ff-50de-4bb9-9581-3868db5ec59e","Type":"ContainerStarted","Data":"f4aa0b4fdd4285efbfd089ba651297c4b82279eb1f3e300aa3aa255fc68ddb20"} Nov 25 09:19:41 crc kubenswrapper[4565]: I1125 09:19:41.180956 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="bcc2a2ff-50de-4bb9-9581-3868db5ec59e" containerName="ceilometer-central-agent" containerID="cri-o://709bfeb620fb62ca2b2a19ab24e5f640a81754e8c8efa9c9a53b4a007a981e24" gracePeriod=30 Nov 25 09:19:41 crc kubenswrapper[4565]: I1125 09:19:41.181084 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 25 09:19:41 crc kubenswrapper[4565]: I1125 09:19:41.181434 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bcc2a2ff-50de-4bb9-9581-3868db5ec59e" containerName="sg-core" containerID="cri-o://dca5cbb374d88e55d6dd3d463b50fe1e12445a46093c082e97d38ec2b414da62" gracePeriod=30 Nov 25 09:19:41 crc kubenswrapper[4565]: I1125 09:19:41.181473 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bcc2a2ff-50de-4bb9-9581-3868db5ec59e" containerName="ceilometer-notification-agent" containerID="cri-o://c5d79a00cf67f858e39faa1faa6bf83a39983cf0cd38c94623de28267d33c514" gracePeriod=30 Nov 25 09:19:41 crc kubenswrapper[4565]: I1125 09:19:41.181446 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bcc2a2ff-50de-4bb9-9581-3868db5ec59e" containerName="proxy-httpd" containerID="cri-o://f4aa0b4fdd4285efbfd089ba651297c4b82279eb1f3e300aa3aa255fc68ddb20" gracePeriod=30 Nov 25 09:19:41 crc kubenswrapper[4565]: I1125 09:19:41.188837 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b4c469f64-4jfxz" event={"ID":"b577976e-309e-47cd-80a6-4f72547d912b","Type":"ContainerStarted","Data":"4906b07418c3b53e2a86f6530188026d41d861791c1576190cd6c339f264e920"} Nov 25 09:19:42 crc kubenswrapper[4565]: I1125 09:19:42.200216 4565 generic.go:334] "Generic (PLEG): container finished" podID="bcc2a2ff-50de-4bb9-9581-3868db5ec59e" containerID="f4aa0b4fdd4285efbfd089ba651297c4b82279eb1f3e300aa3aa255fc68ddb20" exitCode=0 Nov 25 09:19:42 crc 
kubenswrapper[4565]: I1125 09:19:42.200565 4565 generic.go:334] "Generic (PLEG): container finished" podID="bcc2a2ff-50de-4bb9-9581-3868db5ec59e" containerID="dca5cbb374d88e55d6dd3d463b50fe1e12445a46093c082e97d38ec2b414da62" exitCode=2 Nov 25 09:19:42 crc kubenswrapper[4565]: I1125 09:19:42.200576 4565 generic.go:334] "Generic (PLEG): container finished" podID="bcc2a2ff-50de-4bb9-9581-3868db5ec59e" containerID="709bfeb620fb62ca2b2a19ab24e5f640a81754e8c8efa9c9a53b4a007a981e24" exitCode=0 Nov 25 09:19:42 crc kubenswrapper[4565]: I1125 09:19:42.200618 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc2a2ff-50de-4bb9-9581-3868db5ec59e","Type":"ContainerDied","Data":"f4aa0b4fdd4285efbfd089ba651297c4b82279eb1f3e300aa3aa255fc68ddb20"} Nov 25 09:19:42 crc kubenswrapper[4565]: I1125 09:19:42.200650 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc2a2ff-50de-4bb9-9581-3868db5ec59e","Type":"ContainerDied","Data":"dca5cbb374d88e55d6dd3d463b50fe1e12445a46093c082e97d38ec2b414da62"} Nov 25 09:19:42 crc kubenswrapper[4565]: I1125 09:19:42.200661 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc2a2ff-50de-4bb9-9581-3868db5ec59e","Type":"ContainerDied","Data":"709bfeb620fb62ca2b2a19ab24e5f640a81754e8c8efa9c9a53b4a007a981e24"} Nov 25 09:19:42 crc kubenswrapper[4565]: I1125 09:19:42.202061 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9gszq" event={"ID":"33e21a69-41ac-4166-bfeb-6ec0eaff7e64","Type":"ContainerStarted","Data":"ecb74bbb60f1cbd0cbbeb226b021ff41c8730bf7f9a589902f92c49fe9f8872e"} Nov 25 09:19:42 crc kubenswrapper[4565]: I1125 09:19:42.206284 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b4c469f64-4jfxz" 
event={"ID":"b577976e-309e-47cd-80a6-4f72547d912b","Type":"ContainerStarted","Data":"a5f22a2de54e40f1c24320206ee6dcbcb04a7b1277598173e108cef309031c06"} Nov 25 09:19:42 crc kubenswrapper[4565]: I1125 09:19:42.206417 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b4c469f64-4jfxz" event={"ID":"b577976e-309e-47cd-80a6-4f72547d912b","Type":"ContainerStarted","Data":"af126d524c285902dc139f450c4a1e6f252daf875f4f6944ed52527d9ca6ac59"} Nov 25 09:19:42 crc kubenswrapper[4565]: I1125 09:19:42.206749 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:42 crc kubenswrapper[4565]: I1125 09:19:42.206867 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:42 crc kubenswrapper[4565]: I1125 09:19:42.226117 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-9gszq" podStartSLOduration=3.577311479 podStartE2EDuration="43.22610354s" podCreationTimestamp="2025-11-25 09:18:59 +0000 UTC" firstStartedPulling="2025-11-25 09:19:01.030024079 +0000 UTC m=+874.232519218" lastFinishedPulling="2025-11-25 09:19:40.678816141 +0000 UTC m=+913.881311279" observedRunningTime="2025-11-25 09:19:42.224579386 +0000 UTC m=+915.427074514" watchObservedRunningTime="2025-11-25 09:19:42.22610354 +0000 UTC m=+915.428598679" Nov 25 09:19:42 crc kubenswrapper[4565]: I1125 09:19:42.227865 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.276139577 podStartE2EDuration="43.227855753s" podCreationTimestamp="2025-11-25 09:18:59 +0000 UTC" firstStartedPulling="2025-11-25 09:19:00.744582274 +0000 UTC m=+873.947077411" lastFinishedPulling="2025-11-25 09:19:40.696298459 +0000 UTC m=+913.898793587" observedRunningTime="2025-11-25 09:19:41.210081687 +0000 UTC m=+914.412576825" watchObservedRunningTime="2025-11-25 
09:19:42.227855753 +0000 UTC m=+915.430350890" Nov 25 09:19:42 crc kubenswrapper[4565]: I1125 09:19:42.248908 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6b4c469f64-4jfxz" podStartSLOduration=7.248896058 podStartE2EDuration="7.248896058s" podCreationTimestamp="2025-11-25 09:19:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:19:42.245463387 +0000 UTC m=+915.447958524" watchObservedRunningTime="2025-11-25 09:19:42.248896058 +0000 UTC m=+915.451391196" Nov 25 09:19:42 crc kubenswrapper[4565]: I1125 09:19:42.809062 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-844b557b9c-554f5" Nov 25 09:19:42 crc kubenswrapper[4565]: I1125 09:19:42.871137 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6677d66f85-8rlv9"] Nov 25 09:19:42 crc kubenswrapper[4565]: I1125 09:19:42.871971 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" podUID="f1b008af-c21f-44a0-a919-9a79b7b8e26e" containerName="dnsmasq-dns" containerID="cri-o://6d7c1ef1705baee04d4a3c348e526f81df4dcff84af340ec9307ad25780b0664" gracePeriod=10 Nov 25 09:19:43 crc kubenswrapper[4565]: I1125 09:19:43.226089 4565 generic.go:334] "Generic (PLEG): container finished" podID="f1b008af-c21f-44a0-a919-9a79b7b8e26e" containerID="6d7c1ef1705baee04d4a3c348e526f81df4dcff84af340ec9307ad25780b0664" exitCode=0 Nov 25 09:19:43 crc kubenswrapper[4565]: I1125 09:19:43.226248 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" event={"ID":"f1b008af-c21f-44a0-a919-9a79b7b8e26e","Type":"ContainerDied","Data":"6d7c1ef1705baee04d4a3c348e526f81df4dcff84af340ec9307ad25780b0664"} Nov 25 09:19:43 crc kubenswrapper[4565]: I1125 09:19:43.465993 4565 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" Nov 25 09:19:43 crc kubenswrapper[4565]: I1125 09:19:43.553569 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1b008af-c21f-44a0-a919-9a79b7b8e26e-dns-svc\") pod \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\" (UID: \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\") " Nov 25 09:19:43 crc kubenswrapper[4565]: I1125 09:19:43.553650 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1b008af-c21f-44a0-a919-9a79b7b8e26e-ovsdbserver-sb\") pod \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\" (UID: \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\") " Nov 25 09:19:43 crc kubenswrapper[4565]: I1125 09:19:43.553730 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1b008af-c21f-44a0-a919-9a79b7b8e26e-ovsdbserver-nb\") pod \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\" (UID: \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\") " Nov 25 09:19:43 crc kubenswrapper[4565]: I1125 09:19:43.553798 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1b008af-c21f-44a0-a919-9a79b7b8e26e-config\") pod \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\" (UID: \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\") " Nov 25 09:19:43 crc kubenswrapper[4565]: I1125 09:19:43.553838 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nfbv\" (UniqueName: \"kubernetes.io/projected/f1b008af-c21f-44a0-a919-9a79b7b8e26e-kube-api-access-9nfbv\") pod \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\" (UID: \"f1b008af-c21f-44a0-a919-9a79b7b8e26e\") " Nov 25 09:19:43 crc kubenswrapper[4565]: I1125 09:19:43.593420 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f1b008af-c21f-44a0-a919-9a79b7b8e26e-kube-api-access-9nfbv" (OuterVolumeSpecName: "kube-api-access-9nfbv") pod "f1b008af-c21f-44a0-a919-9a79b7b8e26e" (UID: "f1b008af-c21f-44a0-a919-9a79b7b8e26e"). InnerVolumeSpecName "kube-api-access-9nfbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:19:43 crc kubenswrapper[4565]: I1125 09:19:43.631917 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1b008af-c21f-44a0-a919-9a79b7b8e26e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f1b008af-c21f-44a0-a919-9a79b7b8e26e" (UID: "f1b008af-c21f-44a0-a919-9a79b7b8e26e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:19:43 crc kubenswrapper[4565]: I1125 09:19:43.647367 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1b008af-c21f-44a0-a919-9a79b7b8e26e-config" (OuterVolumeSpecName: "config") pod "f1b008af-c21f-44a0-a919-9a79b7b8e26e" (UID: "f1b008af-c21f-44a0-a919-9a79b7b8e26e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:19:43 crc kubenswrapper[4565]: I1125 09:19:43.655410 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1b008af-c21f-44a0-a919-9a79b7b8e26e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f1b008af-c21f-44a0-a919-9a79b7b8e26e" (UID: "f1b008af-c21f-44a0-a919-9a79b7b8e26e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:19:43 crc kubenswrapper[4565]: I1125 09:19:43.657655 4565 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1b008af-c21f-44a0-a919-9a79b7b8e26e-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:43 crc kubenswrapper[4565]: I1125 09:19:43.657686 4565 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1b008af-c21f-44a0-a919-9a79b7b8e26e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:43 crc kubenswrapper[4565]: I1125 09:19:43.657714 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1b008af-c21f-44a0-a919-9a79b7b8e26e-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:43 crc kubenswrapper[4565]: I1125 09:19:43.657727 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nfbv\" (UniqueName: \"kubernetes.io/projected/f1b008af-c21f-44a0-a919-9a79b7b8e26e-kube-api-access-9nfbv\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:43 crc kubenswrapper[4565]: I1125 09:19:43.672444 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1b008af-c21f-44a0-a919-9a79b7b8e26e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f1b008af-c21f-44a0-a919-9a79b7b8e26e" (UID: "f1b008af-c21f-44a0-a919-9a79b7b8e26e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:19:43 crc kubenswrapper[4565]: I1125 09:19:43.760878 4565 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1b008af-c21f-44a0-a919-9a79b7b8e26e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:44 crc kubenswrapper[4565]: I1125 09:19:44.239908 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" event={"ID":"f1b008af-c21f-44a0-a919-9a79b7b8e26e","Type":"ContainerDied","Data":"2becb3583cf2541863bb9e88129af2fe627730e0a2ab1a070034b4e19ba4c58d"} Nov 25 09:19:44 crc kubenswrapper[4565]: I1125 09:19:44.240007 4565 scope.go:117] "RemoveContainer" containerID="6d7c1ef1705baee04d4a3c348e526f81df4dcff84af340ec9307ad25780b0664" Nov 25 09:19:44 crc kubenswrapper[4565]: I1125 09:19:44.240115 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6677d66f85-8rlv9" Nov 25 09:19:44 crc kubenswrapper[4565]: I1125 09:19:44.262012 4565 scope.go:117] "RemoveContainer" containerID="96e9a1cae862c2ce2663f7c6ba1915ba6b6985c0fc55645f115c1cadefafd76b" Nov 25 09:19:44 crc kubenswrapper[4565]: I1125 09:19:44.284373 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6677d66f85-8rlv9"] Nov 25 09:19:44 crc kubenswrapper[4565]: I1125 09:19:44.292706 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6677d66f85-8rlv9"] Nov 25 09:19:44 crc kubenswrapper[4565]: I1125 09:19:44.436860 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-597dbd885b-gqpd6" Nov 25 09:19:44 crc kubenswrapper[4565]: I1125 09:19:44.551191 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-597dbd885b-gqpd6" Nov 25 09:19:44 crc kubenswrapper[4565]: I1125 09:19:44.929099 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/neutron-b76df5f9b-dzngk" Nov 25 09:19:45 crc kubenswrapper[4565]: I1125 09:19:45.111667 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1b008af-c21f-44a0-a919-9a79b7b8e26e" path="/var/lib/kubelet/pods/f1b008af-c21f-44a0-a919-9a79b7b8e26e/volumes" Nov 25 09:19:45 crc kubenswrapper[4565]: I1125 09:19:45.254840 4565 generic.go:334] "Generic (PLEG): container finished" podID="33e21a69-41ac-4166-bfeb-6ec0eaff7e64" containerID="ecb74bbb60f1cbd0cbbeb226b021ff41c8730bf7f9a589902f92c49fe9f8872e" exitCode=0 Nov 25 09:19:45 crc kubenswrapper[4565]: I1125 09:19:45.254943 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9gszq" event={"ID":"33e21a69-41ac-4166-bfeb-6ec0eaff7e64","Type":"ContainerDied","Data":"ecb74bbb60f1cbd0cbbeb226b021ff41c8730bf7f9a589902f92c49fe9f8872e"} Nov 25 09:19:45 crc kubenswrapper[4565]: I1125 09:19:45.778852 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:19:45 crc kubenswrapper[4565]: I1125 09:19:45.804357 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-config-data\") pod \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " Nov 25 09:19:45 crc kubenswrapper[4565]: I1125 09:19:45.804723 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-combined-ca-bundle\") pod \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " Nov 25 09:19:45 crc kubenswrapper[4565]: I1125 09:19:45.804766 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-sg-core-conf-yaml\") pod \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " Nov 25 09:19:45 crc kubenswrapper[4565]: I1125 09:19:45.804851 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-run-httpd\") pod \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " Nov 25 09:19:45 crc kubenswrapper[4565]: I1125 09:19:45.804905 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p24l\" (UniqueName: \"kubernetes.io/projected/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-kube-api-access-5p24l\") pod \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " Nov 25 09:19:45 crc kubenswrapper[4565]: I1125 09:19:45.804976 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-log-httpd\") pod \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " Nov 25 09:19:45 crc kubenswrapper[4565]: I1125 09:19:45.804994 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-scripts\") pod \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\" (UID: \"bcc2a2ff-50de-4bb9-9581-3868db5ec59e\") " Nov 25 09:19:45 crc kubenswrapper[4565]: I1125 09:19:45.805293 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bcc2a2ff-50de-4bb9-9581-3868db5ec59e" (UID: "bcc2a2ff-50de-4bb9-9581-3868db5ec59e"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:19:45 crc kubenswrapper[4565]: I1125 09:19:45.805549 4565 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:45 crc kubenswrapper[4565]: I1125 09:19:45.809484 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bcc2a2ff-50de-4bb9-9581-3868db5ec59e" (UID: "bcc2a2ff-50de-4bb9-9581-3868db5ec59e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:19:45 crc kubenswrapper[4565]: I1125 09:19:45.831129 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-scripts" (OuterVolumeSpecName: "scripts") pod "bcc2a2ff-50de-4bb9-9581-3868db5ec59e" (UID: "bcc2a2ff-50de-4bb9-9581-3868db5ec59e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:45 crc kubenswrapper[4565]: I1125 09:19:45.841584 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-kube-api-access-5p24l" (OuterVolumeSpecName: "kube-api-access-5p24l") pod "bcc2a2ff-50de-4bb9-9581-3868db5ec59e" (UID: "bcc2a2ff-50de-4bb9-9581-3868db5ec59e"). InnerVolumeSpecName "kube-api-access-5p24l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:19:45 crc kubenswrapper[4565]: I1125 09:19:45.865979 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bcc2a2ff-50de-4bb9-9581-3868db5ec59e" (UID: "bcc2a2ff-50de-4bb9-9581-3868db5ec59e"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:45 crc kubenswrapper[4565]: I1125 09:19:45.907711 4565 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:45 crc kubenswrapper[4565]: I1125 09:19:45.907737 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p24l\" (UniqueName: \"kubernetes.io/projected/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-kube-api-access-5p24l\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:45 crc kubenswrapper[4565]: I1125 09:19:45.907750 4565 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:45 crc kubenswrapper[4565]: I1125 09:19:45.907759 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:45 crc kubenswrapper[4565]: I1125 09:19:45.917007 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bcc2a2ff-50de-4bb9-9581-3868db5ec59e" (UID: "bcc2a2ff-50de-4bb9-9581-3868db5ec59e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:45 crc kubenswrapper[4565]: I1125 09:19:45.917515 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-config-data" (OuterVolumeSpecName: "config-data") pod "bcc2a2ff-50de-4bb9-9581-3868db5ec59e" (UID: "bcc2a2ff-50de-4bb9-9581-3868db5ec59e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.009291 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.009326 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc2a2ff-50de-4bb9-9581-3868db5ec59e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.268055 4565 generic.go:334] "Generic (PLEG): container finished" podID="bcc2a2ff-50de-4bb9-9581-3868db5ec59e" containerID="c5d79a00cf67f858e39faa1faa6bf83a39983cf0cd38c94623de28267d33c514" exitCode=0 Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.268396 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.273292 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc2a2ff-50de-4bb9-9581-3868db5ec59e","Type":"ContainerDied","Data":"c5d79a00cf67f858e39faa1faa6bf83a39983cf0cd38c94623de28267d33c514"} Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.273349 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc2a2ff-50de-4bb9-9581-3868db5ec59e","Type":"ContainerDied","Data":"8a86dabb0127021e49202ea997988b534802f23f7473261464a13dec5c45ff68"} Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.273374 4565 scope.go:117] "RemoveContainer" containerID="f4aa0b4fdd4285efbfd089ba651297c4b82279eb1f3e300aa3aa255fc68ddb20" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.305996 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.334971 
4565 scope.go:117] "RemoveContainer" containerID="dca5cbb374d88e55d6dd3d463b50fe1e12445a46093c082e97d38ec2b414da62" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.336753 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.346488 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:19:46 crc kubenswrapper[4565]: E1125 09:19:46.352224 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc2a2ff-50de-4bb9-9581-3868db5ec59e" containerName="ceilometer-central-agent" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.352253 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc2a2ff-50de-4bb9-9581-3868db5ec59e" containerName="ceilometer-central-agent" Nov 25 09:19:46 crc kubenswrapper[4565]: E1125 09:19:46.352267 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc2a2ff-50de-4bb9-9581-3868db5ec59e" containerName="ceilometer-notification-agent" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.352318 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc2a2ff-50de-4bb9-9581-3868db5ec59e" containerName="ceilometer-notification-agent" Nov 25 09:19:46 crc kubenswrapper[4565]: E1125 09:19:46.352327 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc2a2ff-50de-4bb9-9581-3868db5ec59e" containerName="proxy-httpd" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.352334 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc2a2ff-50de-4bb9-9581-3868db5ec59e" containerName="proxy-httpd" Nov 25 09:19:46 crc kubenswrapper[4565]: E1125 09:19:46.352345 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1b008af-c21f-44a0-a919-9a79b7b8e26e" containerName="init" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.352351 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1b008af-c21f-44a0-a919-9a79b7b8e26e" 
containerName="init" Nov 25 09:19:46 crc kubenswrapper[4565]: E1125 09:19:46.352362 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1b008af-c21f-44a0-a919-9a79b7b8e26e" containerName="dnsmasq-dns" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.352387 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1b008af-c21f-44a0-a919-9a79b7b8e26e" containerName="dnsmasq-dns" Nov 25 09:19:46 crc kubenswrapper[4565]: E1125 09:19:46.352430 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc2a2ff-50de-4bb9-9581-3868db5ec59e" containerName="sg-core" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.352438 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc2a2ff-50de-4bb9-9581-3868db5ec59e" containerName="sg-core" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.352741 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc2a2ff-50de-4bb9-9581-3868db5ec59e" containerName="sg-core" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.352775 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc2a2ff-50de-4bb9-9581-3868db5ec59e" containerName="ceilometer-notification-agent" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.352785 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc2a2ff-50de-4bb9-9581-3868db5ec59e" containerName="ceilometer-central-agent" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.352793 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc2a2ff-50de-4bb9-9581-3868db5ec59e" containerName="proxy-httpd" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.352808 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1b008af-c21f-44a0-a919-9a79b7b8e26e" containerName="dnsmasq-dns" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.355646 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:19:46 crc 
kubenswrapper[4565]: I1125 09:19:46.355783 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.358903 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.359167 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.368504 4565 scope.go:117] "RemoveContainer" containerID="c5d79a00cf67f858e39faa1faa6bf83a39983cf0cd38c94623de28267d33c514" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.424116 4565 scope.go:117] "RemoveContainer" containerID="709bfeb620fb62ca2b2a19ab24e5f640a81754e8c8efa9c9a53b4a007a981e24" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.438210 4565 scope.go:117] "RemoveContainer" containerID="f4aa0b4fdd4285efbfd089ba651297c4b82279eb1f3e300aa3aa255fc68ddb20" Nov 25 09:19:46 crc kubenswrapper[4565]: E1125 09:19:46.438565 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4aa0b4fdd4285efbfd089ba651297c4b82279eb1f3e300aa3aa255fc68ddb20\": container with ID starting with f4aa0b4fdd4285efbfd089ba651297c4b82279eb1f3e300aa3aa255fc68ddb20 not found: ID does not exist" containerID="f4aa0b4fdd4285efbfd089ba651297c4b82279eb1f3e300aa3aa255fc68ddb20" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.438593 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdwdf\" (UniqueName: \"kubernetes.io/projected/dab6b35a-23d8-4736-9cf9-87dbec1101d1-kube-api-access-tdwdf\") pod \"ceilometer-0\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " pod="openstack/ceilometer-0" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.438609 4565 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f4aa0b4fdd4285efbfd089ba651297c4b82279eb1f3e300aa3aa255fc68ddb20"} err="failed to get container status \"f4aa0b4fdd4285efbfd089ba651297c4b82279eb1f3e300aa3aa255fc68ddb20\": rpc error: code = NotFound desc = could not find container \"f4aa0b4fdd4285efbfd089ba651297c4b82279eb1f3e300aa3aa255fc68ddb20\": container with ID starting with f4aa0b4fdd4285efbfd089ba651297c4b82279eb1f3e300aa3aa255fc68ddb20 not found: ID does not exist" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.438642 4565 scope.go:117] "RemoveContainer" containerID="dca5cbb374d88e55d6dd3d463b50fe1e12445a46093c082e97d38ec2b414da62" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.438654 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dab6b35a-23d8-4736-9cf9-87dbec1101d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " pod="openstack/ceilometer-0" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.438679 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab6b35a-23d8-4736-9cf9-87dbec1101d1-config-data\") pod \"ceilometer-0\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " pod="openstack/ceilometer-0" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.438809 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dab6b35a-23d8-4736-9cf9-87dbec1101d1-scripts\") pod \"ceilometer-0\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " pod="openstack/ceilometer-0" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.438876 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dab6b35a-23d8-4736-9cf9-87dbec1101d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " pod="openstack/ceilometer-0" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.439346 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dab6b35a-23d8-4736-9cf9-87dbec1101d1-run-httpd\") pod \"ceilometer-0\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " pod="openstack/ceilometer-0" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.439410 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dab6b35a-23d8-4736-9cf9-87dbec1101d1-log-httpd\") pod \"ceilometer-0\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " pod="openstack/ceilometer-0" Nov 25 09:19:46 crc kubenswrapper[4565]: E1125 09:19:46.439830 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dca5cbb374d88e55d6dd3d463b50fe1e12445a46093c082e97d38ec2b414da62\": container with ID starting with dca5cbb374d88e55d6dd3d463b50fe1e12445a46093c082e97d38ec2b414da62 not found: ID does not exist" containerID="dca5cbb374d88e55d6dd3d463b50fe1e12445a46093c082e97d38ec2b414da62" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.439857 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dca5cbb374d88e55d6dd3d463b50fe1e12445a46093c082e97d38ec2b414da62"} err="failed to get container status \"dca5cbb374d88e55d6dd3d463b50fe1e12445a46093c082e97d38ec2b414da62\": rpc error: code = NotFound desc = could not find container \"dca5cbb374d88e55d6dd3d463b50fe1e12445a46093c082e97d38ec2b414da62\": container with ID starting with dca5cbb374d88e55d6dd3d463b50fe1e12445a46093c082e97d38ec2b414da62 not found: ID does not exist" Nov 25 09:19:46 crc 
kubenswrapper[4565]: I1125 09:19:46.439872 4565 scope.go:117] "RemoveContainer" containerID="c5d79a00cf67f858e39faa1faa6bf83a39983cf0cd38c94623de28267d33c514" Nov 25 09:19:46 crc kubenswrapper[4565]: E1125 09:19:46.440346 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5d79a00cf67f858e39faa1faa6bf83a39983cf0cd38c94623de28267d33c514\": container with ID starting with c5d79a00cf67f858e39faa1faa6bf83a39983cf0cd38c94623de28267d33c514 not found: ID does not exist" containerID="c5d79a00cf67f858e39faa1faa6bf83a39983cf0cd38c94623de28267d33c514" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.440392 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5d79a00cf67f858e39faa1faa6bf83a39983cf0cd38c94623de28267d33c514"} err="failed to get container status \"c5d79a00cf67f858e39faa1faa6bf83a39983cf0cd38c94623de28267d33c514\": rpc error: code = NotFound desc = could not find container \"c5d79a00cf67f858e39faa1faa6bf83a39983cf0cd38c94623de28267d33c514\": container with ID starting with c5d79a00cf67f858e39faa1faa6bf83a39983cf0cd38c94623de28267d33c514 not found: ID does not exist" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.440473 4565 scope.go:117] "RemoveContainer" containerID="709bfeb620fb62ca2b2a19ab24e5f640a81754e8c8efa9c9a53b4a007a981e24" Nov 25 09:19:46 crc kubenswrapper[4565]: E1125 09:19:46.440805 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"709bfeb620fb62ca2b2a19ab24e5f640a81754e8c8efa9c9a53b4a007a981e24\": container with ID starting with 709bfeb620fb62ca2b2a19ab24e5f640a81754e8c8efa9c9a53b4a007a981e24 not found: ID does not exist" containerID="709bfeb620fb62ca2b2a19ab24e5f640a81754e8c8efa9c9a53b4a007a981e24" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.440837 4565 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"709bfeb620fb62ca2b2a19ab24e5f640a81754e8c8efa9c9a53b4a007a981e24"} err="failed to get container status \"709bfeb620fb62ca2b2a19ab24e5f640a81754e8c8efa9c9a53b4a007a981e24\": rpc error: code = NotFound desc = could not find container \"709bfeb620fb62ca2b2a19ab24e5f640a81754e8c8efa9c9a53b4a007a981e24\": container with ID starting with 709bfeb620fb62ca2b2a19ab24e5f640a81754e8c8efa9c9a53b4a007a981e24 not found: ID does not exist" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.540874 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdwdf\" (UniqueName: \"kubernetes.io/projected/dab6b35a-23d8-4736-9cf9-87dbec1101d1-kube-api-access-tdwdf\") pod \"ceilometer-0\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " pod="openstack/ceilometer-0" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.540996 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dab6b35a-23d8-4736-9cf9-87dbec1101d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " pod="openstack/ceilometer-0" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.541027 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab6b35a-23d8-4736-9cf9-87dbec1101d1-config-data\") pod \"ceilometer-0\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " pod="openstack/ceilometer-0" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.541046 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dab6b35a-23d8-4736-9cf9-87dbec1101d1-scripts\") pod \"ceilometer-0\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " pod="openstack/ceilometer-0" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.541079 4565 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab6b35a-23d8-4736-9cf9-87dbec1101d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " pod="openstack/ceilometer-0" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.541197 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dab6b35a-23d8-4736-9cf9-87dbec1101d1-run-httpd\") pod \"ceilometer-0\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " pod="openstack/ceilometer-0" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.541237 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dab6b35a-23d8-4736-9cf9-87dbec1101d1-log-httpd\") pod \"ceilometer-0\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " pod="openstack/ceilometer-0" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.541673 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dab6b35a-23d8-4736-9cf9-87dbec1101d1-log-httpd\") pod \"ceilometer-0\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " pod="openstack/ceilometer-0" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.542114 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dab6b35a-23d8-4736-9cf9-87dbec1101d1-run-httpd\") pod \"ceilometer-0\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " pod="openstack/ceilometer-0" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.547442 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dab6b35a-23d8-4736-9cf9-87dbec1101d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " pod="openstack/ceilometer-0" Nov 25 
09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.548438 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dab6b35a-23d8-4736-9cf9-87dbec1101d1-scripts\") pod \"ceilometer-0\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " pod="openstack/ceilometer-0" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.549378 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab6b35a-23d8-4736-9cf9-87dbec1101d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " pod="openstack/ceilometer-0" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.550498 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab6b35a-23d8-4736-9cf9-87dbec1101d1-config-data\") pod \"ceilometer-0\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " pod="openstack/ceilometer-0" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.563806 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdwdf\" (UniqueName: \"kubernetes.io/projected/dab6b35a-23d8-4736-9cf9-87dbec1101d1-kube-api-access-tdwdf\") pod \"ceilometer-0\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " pod="openstack/ceilometer-0" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.621073 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9gszq" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.673306 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.745582 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs8gv\" (UniqueName: \"kubernetes.io/projected/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-kube-api-access-fs8gv\") pod \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\" (UID: \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\") " Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.745680 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-config-data\") pod \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\" (UID: \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\") " Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.745829 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-db-sync-config-data\") pod \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\" (UID: \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\") " Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.746542 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-scripts\") pod \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\" (UID: \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\") " Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.746820 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-combined-ca-bundle\") pod \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\" (UID: \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\") " Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.746847 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-etc-machine-id\") pod \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\" (UID: \"33e21a69-41ac-4166-bfeb-6ec0eaff7e64\") " Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.746943 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "33e21a69-41ac-4166-bfeb-6ec0eaff7e64" (UID: "33e21a69-41ac-4166-bfeb-6ec0eaff7e64"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.748641 4565 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.750991 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-scripts" (OuterVolumeSpecName: "scripts") pod "33e21a69-41ac-4166-bfeb-6ec0eaff7e64" (UID: "33e21a69-41ac-4166-bfeb-6ec0eaff7e64"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.751031 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-kube-api-access-fs8gv" (OuterVolumeSpecName: "kube-api-access-fs8gv") pod "33e21a69-41ac-4166-bfeb-6ec0eaff7e64" (UID: "33e21a69-41ac-4166-bfeb-6ec0eaff7e64"). InnerVolumeSpecName "kube-api-access-fs8gv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.752366 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "33e21a69-41ac-4166-bfeb-6ec0eaff7e64" (UID: "33e21a69-41ac-4166-bfeb-6ec0eaff7e64"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.768818 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33e21a69-41ac-4166-bfeb-6ec0eaff7e64" (UID: "33e21a69-41ac-4166-bfeb-6ec0eaff7e64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.795920 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-config-data" (OuterVolumeSpecName: "config-data") pod "33e21a69-41ac-4166-bfeb-6ec0eaff7e64" (UID: "33e21a69-41ac-4166-bfeb-6ec0eaff7e64"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.850314 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.850344 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.850357 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fs8gv\" (UniqueName: \"kubernetes.io/projected/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-kube-api-access-fs8gv\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.850369 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:46 crc kubenswrapper[4565]: I1125 09:19:46.850379 4565 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/33e21a69-41ac-4166-bfeb-6ec0eaff7e64-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.110806 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcc2a2ff-50de-4bb9-9581-3868db5ec59e" path="/var/lib/kubelet/pods/bcc2a2ff-50de-4bb9-9581-3868db5ec59e/volumes" Nov 25 09:19:47 crc kubenswrapper[4565]: W1125 09:19:47.112250 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddab6b35a_23d8_4736_9cf9_87dbec1101d1.slice/crio-119990ee1ba13f833f7aa67315574b78a4b238a60294029d3e6e283fed6306b6 WatchSource:0}: Error finding container 
119990ee1ba13f833f7aa67315574b78a4b238a60294029d3e6e283fed6306b6: Status 404 returned error can't find the container with id 119990ee1ba13f833f7aa67315574b78a4b238a60294029d3e6e283fed6306b6 Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.112581 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.281554 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9gszq" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.281596 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9gszq" event={"ID":"33e21a69-41ac-4166-bfeb-6ec0eaff7e64","Type":"ContainerDied","Data":"ca9c80a53639c414326f75639004cd0157fdce9c1a2fd60c7f76f7909114dd1c"} Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.281653 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca9c80a53639c414326f75639004cd0157fdce9c1a2fd60c7f76f7909114dd1c" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.284423 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dab6b35a-23d8-4736-9cf9-87dbec1101d1","Type":"ContainerStarted","Data":"119990ee1ba13f833f7aa67315574b78a4b238a60294029d3e6e283fed6306b6"} Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.356801 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6d4d697775-b8wbb" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.397584 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b76df5f9b-dzngk"] Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.398093 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-b76df5f9b-dzngk" podUID="4a7875ff-89fe-4226-904b-622edafc2aac" containerName="neutron-api" 
containerID="cri-o://7e2778b9507db2a090098f46b3d8535fb3445c059d9aa00e260b6fe9bf076525" gracePeriod=30 Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.398206 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-b76df5f9b-dzngk" podUID="4a7875ff-89fe-4226-904b-622edafc2aac" containerName="neutron-httpd" containerID="cri-o://973f7a2aeb8540746e3853b1737e057059600417fb0500a77215b381aa150c99" gracePeriod=30 Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.622756 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 09:19:47 crc kubenswrapper[4565]: E1125 09:19:47.636748 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33e21a69-41ac-4166-bfeb-6ec0eaff7e64" containerName="cinder-db-sync" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.636776 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="33e21a69-41ac-4166-bfeb-6ec0eaff7e64" containerName="cinder-db-sync" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.637051 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="33e21a69-41ac-4166-bfeb-6ec0eaff7e64" containerName="cinder-db-sync" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.637964 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.649708 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.650051 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-t466t" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.650236 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.650424 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.684991 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.778560 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d85173b8-4999-4cd5-90e0-bae68dd0eeca-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\") " pod="openstack/cinder-scheduler-0" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.784499 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d85173b8-4999-4cd5-90e0-bae68dd0eeca-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\") " pod="openstack/cinder-scheduler-0" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.784743 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d85173b8-4999-4cd5-90e0-bae68dd0eeca-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"d85173b8-4999-4cd5-90e0-bae68dd0eeca\") " pod="openstack/cinder-scheduler-0" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.784863 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d85173b8-4999-4cd5-90e0-bae68dd0eeca-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\") " pod="openstack/cinder-scheduler-0" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.785085 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxj2g\" (UniqueName: \"kubernetes.io/projected/d85173b8-4999-4cd5-90e0-bae68dd0eeca-kube-api-access-bxj2g\") pod \"cinder-scheduler-0\" (UID: \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\") " pod="openstack/cinder-scheduler-0" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.785241 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d85173b8-4999-4cd5-90e0-bae68dd0eeca-config-data\") pod \"cinder-scheduler-0\" (UID: \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\") " pod="openstack/cinder-scheduler-0" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.822325 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-775457b975-zp8vm"] Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.850766 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-775457b975-zp8vm" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.851843 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-775457b975-zp8vm"] Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.893074 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d85173b8-4999-4cd5-90e0-bae68dd0eeca-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\") " pod="openstack/cinder-scheduler-0" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.893172 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d85173b8-4999-4cd5-90e0-bae68dd0eeca-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\") " pod="openstack/cinder-scheduler-0" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.893826 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d85173b8-4999-4cd5-90e0-bae68dd0eeca-scripts\") pod \"cinder-scheduler-0\" (UID: \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\") " pod="openstack/cinder-scheduler-0" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.894055 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d85173b8-4999-4cd5-90e0-bae68dd0eeca-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\") " pod="openstack/cinder-scheduler-0" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.894159 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxj2g\" (UniqueName: \"kubernetes.io/projected/d85173b8-4999-4cd5-90e0-bae68dd0eeca-kube-api-access-bxj2g\") pod 
\"cinder-scheduler-0\" (UID: \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\") " pod="openstack/cinder-scheduler-0" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.894273 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d85173b8-4999-4cd5-90e0-bae68dd0eeca-config-data\") pod \"cinder-scheduler-0\" (UID: \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\") " pod="openstack/cinder-scheduler-0" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.894972 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d85173b8-4999-4cd5-90e0-bae68dd0eeca-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\") " pod="openstack/cinder-scheduler-0" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.905561 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d85173b8-4999-4cd5-90e0-bae68dd0eeca-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\") " pod="openstack/cinder-scheduler-0" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.916711 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d85173b8-4999-4cd5-90e0-bae68dd0eeca-scripts\") pod \"cinder-scheduler-0\" (UID: \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\") " pod="openstack/cinder-scheduler-0" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.918588 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d85173b8-4999-4cd5-90e0-bae68dd0eeca-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\") " pod="openstack/cinder-scheduler-0" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.919484 4565 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d85173b8-4999-4cd5-90e0-bae68dd0eeca-config-data\") pod \"cinder-scheduler-0\" (UID: \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\") " pod="openstack/cinder-scheduler-0" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.920471 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxj2g\" (UniqueName: \"kubernetes.io/projected/d85173b8-4999-4cd5-90e0-bae68dd0eeca-kube-api-access-bxj2g\") pod \"cinder-scheduler-0\" (UID: \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\") " pod="openstack/cinder-scheduler-0" Nov 25 09:19:47 crc kubenswrapper[4565]: E1125 09:19:47.980540 4565 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a7875ff_89fe_4226_904b_622edafc2aac.slice/crio-conmon-973f7a2aeb8540746e3853b1737e057059600417fb0500a77215b381aa150c99.scope\": RecentStats: unable to find data in memory cache]" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.993443 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.995066 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 25 09:19:47 crc kubenswrapper[4565]: I1125 09:19:47.998301 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.000439 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twwxh\" (UniqueName: \"kubernetes.io/projected/5152cfbe-d229-461a-b9f0-07370920821b-kube-api-access-twwxh\") pod \"dnsmasq-dns-775457b975-zp8vm\" (UID: \"5152cfbe-d229-461a-b9f0-07370920821b\") " pod="openstack/dnsmasq-dns-775457b975-zp8vm" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.000667 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5152cfbe-d229-461a-b9f0-07370920821b-ovsdbserver-nb\") pod \"dnsmasq-dns-775457b975-zp8vm\" (UID: \"5152cfbe-d229-461a-b9f0-07370920821b\") " pod="openstack/dnsmasq-dns-775457b975-zp8vm" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.000780 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5152cfbe-d229-461a-b9f0-07370920821b-ovsdbserver-sb\") pod \"dnsmasq-dns-775457b975-zp8vm\" (UID: \"5152cfbe-d229-461a-b9f0-07370920821b\") " pod="openstack/dnsmasq-dns-775457b975-zp8vm" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.000873 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5152cfbe-d229-461a-b9f0-07370920821b-config\") pod \"dnsmasq-dns-775457b975-zp8vm\" (UID: \"5152cfbe-d229-461a-b9f0-07370920821b\") " pod="openstack/dnsmasq-dns-775457b975-zp8vm" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.000984 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5152cfbe-d229-461a-b9f0-07370920821b-dns-svc\") pod \"dnsmasq-dns-775457b975-zp8vm\" (UID: \"5152cfbe-d229-461a-b9f0-07370920821b\") " pod="openstack/dnsmasq-dns-775457b975-zp8vm" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.005262 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.006691 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.102675 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5152cfbe-d229-461a-b9f0-07370920821b-dns-svc\") pod \"dnsmasq-dns-775457b975-zp8vm\" (UID: \"5152cfbe-d229-461a-b9f0-07370920821b\") " pod="openstack/dnsmasq-dns-775457b975-zp8vm" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.102755 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba79431b-1632-436a-86bc-ff20d359a24f-scripts\") pod \"cinder-api-0\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " pod="openstack/cinder-api-0" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.102777 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba79431b-1632-436a-86bc-ff20d359a24f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " pod="openstack/cinder-api-0" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.102843 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba79431b-1632-436a-86bc-ff20d359a24f-logs\") pod \"cinder-api-0\" (UID: 
\"ba79431b-1632-436a-86bc-ff20d359a24f\") " pod="openstack/cinder-api-0" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.102882 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba79431b-1632-436a-86bc-ff20d359a24f-config-data-custom\") pod \"cinder-api-0\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " pod="openstack/cinder-api-0" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.102903 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kjc8\" (UniqueName: \"kubernetes.io/projected/ba79431b-1632-436a-86bc-ff20d359a24f-kube-api-access-4kjc8\") pod \"cinder-api-0\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " pod="openstack/cinder-api-0" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.103090 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twwxh\" (UniqueName: \"kubernetes.io/projected/5152cfbe-d229-461a-b9f0-07370920821b-kube-api-access-twwxh\") pod \"dnsmasq-dns-775457b975-zp8vm\" (UID: \"5152cfbe-d229-461a-b9f0-07370920821b\") " pod="openstack/dnsmasq-dns-775457b975-zp8vm" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.103156 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba79431b-1632-436a-86bc-ff20d359a24f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " pod="openstack/cinder-api-0" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.103213 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba79431b-1632-436a-86bc-ff20d359a24f-config-data\") pod \"cinder-api-0\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " pod="openstack/cinder-api-0" Nov 25 
09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.103236 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5152cfbe-d229-461a-b9f0-07370920821b-ovsdbserver-nb\") pod \"dnsmasq-dns-775457b975-zp8vm\" (UID: \"5152cfbe-d229-461a-b9f0-07370920821b\") " pod="openstack/dnsmasq-dns-775457b975-zp8vm" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.103255 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5152cfbe-d229-461a-b9f0-07370920821b-ovsdbserver-sb\") pod \"dnsmasq-dns-775457b975-zp8vm\" (UID: \"5152cfbe-d229-461a-b9f0-07370920821b\") " pod="openstack/dnsmasq-dns-775457b975-zp8vm" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.103288 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5152cfbe-d229-461a-b9f0-07370920821b-config\") pod \"dnsmasq-dns-775457b975-zp8vm\" (UID: \"5152cfbe-d229-461a-b9f0-07370920821b\") " pod="openstack/dnsmasq-dns-775457b975-zp8vm" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.104510 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5152cfbe-d229-461a-b9f0-07370920821b-dns-svc\") pod \"dnsmasq-dns-775457b975-zp8vm\" (UID: \"5152cfbe-d229-461a-b9f0-07370920821b\") " pod="openstack/dnsmasq-dns-775457b975-zp8vm" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.105472 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5152cfbe-d229-461a-b9f0-07370920821b-ovsdbserver-sb\") pod \"dnsmasq-dns-775457b975-zp8vm\" (UID: \"5152cfbe-d229-461a-b9f0-07370920821b\") " pod="openstack/dnsmasq-dns-775457b975-zp8vm" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.105625 4565 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5152cfbe-d229-461a-b9f0-07370920821b-config\") pod \"dnsmasq-dns-775457b975-zp8vm\" (UID: \"5152cfbe-d229-461a-b9f0-07370920821b\") " pod="openstack/dnsmasq-dns-775457b975-zp8vm" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.105851 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5152cfbe-d229-461a-b9f0-07370920821b-ovsdbserver-nb\") pod \"dnsmasq-dns-775457b975-zp8vm\" (UID: \"5152cfbe-d229-461a-b9f0-07370920821b\") " pod="openstack/dnsmasq-dns-775457b975-zp8vm" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.123244 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twwxh\" (UniqueName: \"kubernetes.io/projected/5152cfbe-d229-461a-b9f0-07370920821b-kube-api-access-twwxh\") pod \"dnsmasq-dns-775457b975-zp8vm\" (UID: \"5152cfbe-d229-461a-b9f0-07370920821b\") " pod="openstack/dnsmasq-dns-775457b975-zp8vm" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.207185 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba79431b-1632-436a-86bc-ff20d359a24f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " pod="openstack/cinder-api-0" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.207296 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba79431b-1632-436a-86bc-ff20d359a24f-config-data\") pod \"cinder-api-0\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " pod="openstack/cinder-api-0" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.207366 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba79431b-1632-436a-86bc-ff20d359a24f-scripts\") pod 
\"cinder-api-0\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " pod="openstack/cinder-api-0" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.207390 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba79431b-1632-436a-86bc-ff20d359a24f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " pod="openstack/cinder-api-0" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.207476 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba79431b-1632-436a-86bc-ff20d359a24f-logs\") pod \"cinder-api-0\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " pod="openstack/cinder-api-0" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.207526 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba79431b-1632-436a-86bc-ff20d359a24f-config-data-custom\") pod \"cinder-api-0\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " pod="openstack/cinder-api-0" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.207546 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kjc8\" (UniqueName: \"kubernetes.io/projected/ba79431b-1632-436a-86bc-ff20d359a24f-kube-api-access-4kjc8\") pod \"cinder-api-0\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " pod="openstack/cinder-api-0" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.208122 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba79431b-1632-436a-86bc-ff20d359a24f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " pod="openstack/cinder-api-0" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.209809 4565 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba79431b-1632-436a-86bc-ff20d359a24f-logs\") pod \"cinder-api-0\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " pod="openstack/cinder-api-0" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.210272 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-775457b975-zp8vm" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.213876 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba79431b-1632-436a-86bc-ff20d359a24f-config-data-custom\") pod \"cinder-api-0\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " pod="openstack/cinder-api-0" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.221100 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba79431b-1632-436a-86bc-ff20d359a24f-scripts\") pod \"cinder-api-0\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " pod="openstack/cinder-api-0" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.222025 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba79431b-1632-436a-86bc-ff20d359a24f-config-data\") pod \"cinder-api-0\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " pod="openstack/cinder-api-0" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.225069 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba79431b-1632-436a-86bc-ff20d359a24f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " pod="openstack/cinder-api-0" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.239523 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kjc8\" (UniqueName: 
\"kubernetes.io/projected/ba79431b-1632-436a-86bc-ff20d359a24f-kube-api-access-4kjc8\") pod \"cinder-api-0\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " pod="openstack/cinder-api-0" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.320463 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dab6b35a-23d8-4736-9cf9-87dbec1101d1","Type":"ContainerStarted","Data":"b0d1f9b50f5fc9dc981ea58a645b4449ceecf493f3fc9764d59d32f72982ef9a"} Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.329288 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.339770 4565 generic.go:334] "Generic (PLEG): container finished" podID="4a7875ff-89fe-4226-904b-622edafc2aac" containerID="973f7a2aeb8540746e3853b1737e057059600417fb0500a77215b381aa150c99" exitCode=0 Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.339813 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b76df5f9b-dzngk" event={"ID":"4a7875ff-89fe-4226-904b-622edafc2aac","Type":"ContainerDied","Data":"973f7a2aeb8540746e3853b1737e057059600417fb0500a77215b381aa150c99"} Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.558897 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 09:19:48 crc kubenswrapper[4565]: W1125 09:19:48.574411 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd85173b8_4999_4cd5_90e0_bae68dd0eeca.slice/crio-b977f5034bddd21612d8ee2fdd1a2844133dc89efbee62ef39c08f38b6bb04cf WatchSource:0}: Error finding container b977f5034bddd21612d8ee2fdd1a2844133dc89efbee62ef39c08f38b6bb04cf: Status 404 returned error can't find the container with id b977f5034bddd21612d8ee2fdd1a2844133dc89efbee62ef39c08f38b6bb04cf Nov 25 09:19:48 crc kubenswrapper[4565]: W1125 09:19:48.878174 4565 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5152cfbe_d229_461a_b9f0_07370920821b.slice/crio-2a4e509f1b12160133709ffeb928c9bff3db23924301f69fb6cbe9b5edd68623 WatchSource:0}: Error finding container 2a4e509f1b12160133709ffeb928c9bff3db23924301f69fb6cbe9b5edd68623: Status 404 returned error can't find the container with id 2a4e509f1b12160133709ffeb928c9bff3db23924301f69fb6cbe9b5edd68623 Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.878997 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-775457b975-zp8vm"] Nov 25 09:19:48 crc kubenswrapper[4565]: I1125 09:19:48.963137 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 25 09:19:48 crc kubenswrapper[4565]: W1125 09:19:48.980183 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba79431b_1632_436a_86bc_ff20d359a24f.slice/crio-b825ed4e767c42c26fffaedeadfeeb803084248100308ab81b6fc3c12574ba81 WatchSource:0}: Error finding container b825ed4e767c42c26fffaedeadfeeb803084248100308ab81b6fc3c12574ba81: Status 404 returned error can't find the container with id b825ed4e767c42c26fffaedeadfeeb803084248100308ab81b6fc3c12574ba81 Nov 25 09:19:49 crc kubenswrapper[4565]: I1125 09:19:49.351850 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d85173b8-4999-4cd5-90e0-bae68dd0eeca","Type":"ContainerStarted","Data":"b977f5034bddd21612d8ee2fdd1a2844133dc89efbee62ef39c08f38b6bb04cf"} Nov 25 09:19:49 crc kubenswrapper[4565]: I1125 09:19:49.354457 4565 generic.go:334] "Generic (PLEG): container finished" podID="5152cfbe-d229-461a-b9f0-07370920821b" containerID="705315a3da171a0197161b151d60aa9f71ed01f35ad377449880462353da093e" exitCode=0 Nov 25 09:19:49 crc kubenswrapper[4565]: I1125 09:19:49.354540 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-775457b975-zp8vm" event={"ID":"5152cfbe-d229-461a-b9f0-07370920821b","Type":"ContainerDied","Data":"705315a3da171a0197161b151d60aa9f71ed01f35ad377449880462353da093e"} Nov 25 09:19:49 crc kubenswrapper[4565]: I1125 09:19:49.354581 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775457b975-zp8vm" event={"ID":"5152cfbe-d229-461a-b9f0-07370920821b","Type":"ContainerStarted","Data":"2a4e509f1b12160133709ffeb928c9bff3db23924301f69fb6cbe9b5edd68623"} Nov 25 09:19:49 crc kubenswrapper[4565]: I1125 09:19:49.360480 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dab6b35a-23d8-4736-9cf9-87dbec1101d1","Type":"ContainerStarted","Data":"70bf1177a68a3ed02b2b85b7f496bb303d25b49149aa3fdb0f7ca2bb65879d1d"} Nov 25 09:19:49 crc kubenswrapper[4565]: I1125 09:19:49.364292 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ba79431b-1632-436a-86bc-ff20d359a24f","Type":"ContainerStarted","Data":"b825ed4e767c42c26fffaedeadfeeb803084248100308ab81b6fc3c12574ba81"} Nov 25 09:19:49 crc kubenswrapper[4565]: I1125 09:19:49.659903 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 25 09:19:50 crc kubenswrapper[4565]: I1125 09:19:50.409009 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ba79431b-1632-436a-86bc-ff20d359a24f","Type":"ContainerStarted","Data":"0eb4635a97793b3bb616bcbe99baaea1aa1d29ef6cf4364c40c1b2f00a09003c"} Nov 25 09:19:50 crc kubenswrapper[4565]: I1125 09:19:50.423943 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775457b975-zp8vm" event={"ID":"5152cfbe-d229-461a-b9f0-07370920821b","Type":"ContainerStarted","Data":"b6e65b11377ed310c9c5c120c2ea0020e10a78cd66e546f1cf3e393fa6cf51dc"} Nov 25 09:19:50 crc kubenswrapper[4565]: I1125 09:19:50.424863 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/dnsmasq-dns-775457b975-zp8vm" Nov 25 09:19:50 crc kubenswrapper[4565]: I1125 09:19:50.476487 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dab6b35a-23d8-4736-9cf9-87dbec1101d1","Type":"ContainerStarted","Data":"a710b564acc4f167fb40735b405bac7a40c546f8508f0fafc908919423c0a3ac"} Nov 25 09:19:51 crc kubenswrapper[4565]: I1125 09:19:51.486439 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dab6b35a-23d8-4736-9cf9-87dbec1101d1","Type":"ContainerStarted","Data":"67a93da19e1a3aac3d51f5ae7df42f5ed58a8d04eb4e5f9b3775dc5ab2c19836"} Nov 25 09:19:51 crc kubenswrapper[4565]: I1125 09:19:51.487237 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 25 09:19:51 crc kubenswrapper[4565]: I1125 09:19:51.488945 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ba79431b-1632-436a-86bc-ff20d359a24f","Type":"ContainerStarted","Data":"8d4f83376b7c62cea053850d5772011c50b1c515c0f8afd263c60e919c637cb6"} Nov 25 09:19:51 crc kubenswrapper[4565]: I1125 09:19:51.489094 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 25 09:19:51 crc kubenswrapper[4565]: I1125 09:19:51.489077 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ba79431b-1632-436a-86bc-ff20d359a24f" containerName="cinder-api-log" containerID="cri-o://0eb4635a97793b3bb616bcbe99baaea1aa1d29ef6cf4364c40c1b2f00a09003c" gracePeriod=30 Nov 25 09:19:51 crc kubenswrapper[4565]: I1125 09:19:51.489152 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ba79431b-1632-436a-86bc-ff20d359a24f" containerName="cinder-api" containerID="cri-o://8d4f83376b7c62cea053850d5772011c50b1c515c0f8afd263c60e919c637cb6" gracePeriod=30 Nov 25 09:19:51 crc 
kubenswrapper[4565]: I1125 09:19:51.492702 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d85173b8-4999-4cd5-90e0-bae68dd0eeca","Type":"ContainerStarted","Data":"a54c509f400d7d8fb04d2948975467d18f53f2776f3fd62b04bef1ae5fa0aaff"} Nov 25 09:19:51 crc kubenswrapper[4565]: I1125 09:19:51.492828 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d85173b8-4999-4cd5-90e0-bae68dd0eeca","Type":"ContainerStarted","Data":"b5e49c5c21aed59471de4026ada558e6264593b7523ca34429f5b3127979c9a8"} Nov 25 09:19:51 crc kubenswrapper[4565]: I1125 09:19:51.510636 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-775457b975-zp8vm" podStartSLOduration=4.51062203 podStartE2EDuration="4.51062203s" podCreationTimestamp="2025-11-25 09:19:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:19:50.485371109 +0000 UTC m=+923.687866237" watchObservedRunningTime="2025-11-25 09:19:51.51062203 +0000 UTC m=+924.713117169" Nov 25 09:19:51 crc kubenswrapper[4565]: I1125 09:19:51.514726 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.579440887 podStartE2EDuration="5.514713974s" podCreationTimestamp="2025-11-25 09:19:46 +0000 UTC" firstStartedPulling="2025-11-25 09:19:47.114536575 +0000 UTC m=+920.317031712" lastFinishedPulling="2025-11-25 09:19:51.04980966 +0000 UTC m=+924.252304799" observedRunningTime="2025-11-25 09:19:51.510165499 +0000 UTC m=+924.712660637" watchObservedRunningTime="2025-11-25 09:19:51.514713974 +0000 UTC m=+924.717209112" Nov 25 09:19:51 crc kubenswrapper[4565]: I1125 09:19:51.538433 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.538411758 podStartE2EDuration="4.538411758s" 
podCreationTimestamp="2025-11-25 09:19:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:19:51.535378761 +0000 UTC m=+924.737873899" watchObservedRunningTime="2025-11-25 09:19:51.538411758 +0000 UTC m=+924.740906896" Nov 25 09:19:51 crc kubenswrapper[4565]: I1125 09:19:51.561448 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.467523483 podStartE2EDuration="4.561431554s" podCreationTimestamp="2025-11-25 09:19:47 +0000 UTC" firstStartedPulling="2025-11-25 09:19:48.593739803 +0000 UTC m=+921.796234941" lastFinishedPulling="2025-11-25 09:19:49.687647874 +0000 UTC m=+922.890143012" observedRunningTime="2025-11-25 09:19:51.551963065 +0000 UTC m=+924.754458203" watchObservedRunningTime="2025-11-25 09:19:51.561431554 +0000 UTC m=+924.763926692" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.489154 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.521615 4565 generic.go:334] "Generic (PLEG): container finished" podID="ba79431b-1632-436a-86bc-ff20d359a24f" containerID="8d4f83376b7c62cea053850d5772011c50b1c515c0f8afd263c60e919c637cb6" exitCode=0 Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.521648 4565 generic.go:334] "Generic (PLEG): container finished" podID="ba79431b-1632-436a-86bc-ff20d359a24f" containerID="0eb4635a97793b3bb616bcbe99baaea1aa1d29ef6cf4364c40c1b2f00a09003c" exitCode=143 Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.521695 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ba79431b-1632-436a-86bc-ff20d359a24f","Type":"ContainerDied","Data":"8d4f83376b7c62cea053850d5772011c50b1c515c0f8afd263c60e919c637cb6"} Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.521726 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ba79431b-1632-436a-86bc-ff20d359a24f","Type":"ContainerDied","Data":"0eb4635a97793b3bb616bcbe99baaea1aa1d29ef6cf4364c40c1b2f00a09003c"} Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.521737 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ba79431b-1632-436a-86bc-ff20d359a24f","Type":"ContainerDied","Data":"b825ed4e767c42c26fffaedeadfeeb803084248100308ab81b6fc3c12574ba81"} Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.521754 4565 scope.go:117] "RemoveContainer" containerID="8d4f83376b7c62cea053850d5772011c50b1c515c0f8afd263c60e919c637cb6" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.521880 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.544485 4565 generic.go:334] "Generic (PLEG): container finished" podID="4a7875ff-89fe-4226-904b-622edafc2aac" containerID="7e2778b9507db2a090098f46b3d8535fb3445c059d9aa00e260b6fe9bf076525" exitCode=0 Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.545988 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b76df5f9b-dzngk" event={"ID":"4a7875ff-89fe-4226-904b-622edafc2aac","Type":"ContainerDied","Data":"7e2778b9507db2a090098f46b3d8535fb3445c059d9aa00e260b6fe9bf076525"} Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.620406 4565 scope.go:117] "RemoveContainer" containerID="0eb4635a97793b3bb616bcbe99baaea1aa1d29ef6cf4364c40c1b2f00a09003c" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.629663 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b76df5f9b-dzngk" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.657131 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kjc8\" (UniqueName: \"kubernetes.io/projected/ba79431b-1632-436a-86bc-ff20d359a24f-kube-api-access-4kjc8\") pod \"ba79431b-1632-436a-86bc-ff20d359a24f\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.657163 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba79431b-1632-436a-86bc-ff20d359a24f-scripts\") pod \"ba79431b-1632-436a-86bc-ff20d359a24f\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.657209 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba79431b-1632-436a-86bc-ff20d359a24f-etc-machine-id\") pod \"ba79431b-1632-436a-86bc-ff20d359a24f\" (UID: 
\"ba79431b-1632-436a-86bc-ff20d359a24f\") " Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.657296 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba79431b-1632-436a-86bc-ff20d359a24f-combined-ca-bundle\") pod \"ba79431b-1632-436a-86bc-ff20d359a24f\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.657317 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba79431b-1632-436a-86bc-ff20d359a24f-config-data\") pod \"ba79431b-1632-436a-86bc-ff20d359a24f\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.657368 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba79431b-1632-436a-86bc-ff20d359a24f-config-data-custom\") pod \"ba79431b-1632-436a-86bc-ff20d359a24f\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.657396 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba79431b-1632-436a-86bc-ff20d359a24f-logs\") pod \"ba79431b-1632-436a-86bc-ff20d359a24f\" (UID: \"ba79431b-1632-436a-86bc-ff20d359a24f\") " Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.657453 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ba79431b-1632-436a-86bc-ff20d359a24f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ba79431b-1632-436a-86bc-ff20d359a24f" (UID: "ba79431b-1632-436a-86bc-ff20d359a24f"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.658784 4565 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba79431b-1632-436a-86bc-ff20d359a24f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.663135 4565 scope.go:117] "RemoveContainer" containerID="8d4f83376b7c62cea053850d5772011c50b1c515c0f8afd263c60e919c637cb6" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.663994 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba79431b-1632-436a-86bc-ff20d359a24f-logs" (OuterVolumeSpecName: "logs") pod "ba79431b-1632-436a-86bc-ff20d359a24f" (UID: "ba79431b-1632-436a-86bc-ff20d359a24f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:19:52 crc kubenswrapper[4565]: E1125 09:19:52.673035 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d4f83376b7c62cea053850d5772011c50b1c515c0f8afd263c60e919c637cb6\": container with ID starting with 8d4f83376b7c62cea053850d5772011c50b1c515c0f8afd263c60e919c637cb6 not found: ID does not exist" containerID="8d4f83376b7c62cea053850d5772011c50b1c515c0f8afd263c60e919c637cb6" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.673075 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d4f83376b7c62cea053850d5772011c50b1c515c0f8afd263c60e919c637cb6"} err="failed to get container status \"8d4f83376b7c62cea053850d5772011c50b1c515c0f8afd263c60e919c637cb6\": rpc error: code = NotFound desc = could not find container \"8d4f83376b7c62cea053850d5772011c50b1c515c0f8afd263c60e919c637cb6\": container with ID starting with 8d4f83376b7c62cea053850d5772011c50b1c515c0f8afd263c60e919c637cb6 not found: ID does not exist" Nov 25 09:19:52 crc 
kubenswrapper[4565]: I1125 09:19:52.673101 4565 scope.go:117] "RemoveContainer" containerID="0eb4635a97793b3bb616bcbe99baaea1aa1d29ef6cf4364c40c1b2f00a09003c" Nov 25 09:19:52 crc kubenswrapper[4565]: E1125 09:19:52.676999 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eb4635a97793b3bb616bcbe99baaea1aa1d29ef6cf4364c40c1b2f00a09003c\": container with ID starting with 0eb4635a97793b3bb616bcbe99baaea1aa1d29ef6cf4364c40c1b2f00a09003c not found: ID does not exist" containerID="0eb4635a97793b3bb616bcbe99baaea1aa1d29ef6cf4364c40c1b2f00a09003c" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.677027 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eb4635a97793b3bb616bcbe99baaea1aa1d29ef6cf4364c40c1b2f00a09003c"} err="failed to get container status \"0eb4635a97793b3bb616bcbe99baaea1aa1d29ef6cf4364c40c1b2f00a09003c\": rpc error: code = NotFound desc = could not find container \"0eb4635a97793b3bb616bcbe99baaea1aa1d29ef6cf4364c40c1b2f00a09003c\": container with ID starting with 0eb4635a97793b3bb616bcbe99baaea1aa1d29ef6cf4364c40c1b2f00a09003c not found: ID does not exist" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.677047 4565 scope.go:117] "RemoveContainer" containerID="8d4f83376b7c62cea053850d5772011c50b1c515c0f8afd263c60e919c637cb6" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.683875 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba79431b-1632-436a-86bc-ff20d359a24f-scripts" (OuterVolumeSpecName: "scripts") pod "ba79431b-1632-436a-86bc-ff20d359a24f" (UID: "ba79431b-1632-436a-86bc-ff20d359a24f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.684387 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba79431b-1632-436a-86bc-ff20d359a24f-kube-api-access-4kjc8" (OuterVolumeSpecName: "kube-api-access-4kjc8") pod "ba79431b-1632-436a-86bc-ff20d359a24f" (UID: "ba79431b-1632-436a-86bc-ff20d359a24f"). InnerVolumeSpecName "kube-api-access-4kjc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.684391 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d4f83376b7c62cea053850d5772011c50b1c515c0f8afd263c60e919c637cb6"} err="failed to get container status \"8d4f83376b7c62cea053850d5772011c50b1c515c0f8afd263c60e919c637cb6\": rpc error: code = NotFound desc = could not find container \"8d4f83376b7c62cea053850d5772011c50b1c515c0f8afd263c60e919c637cb6\": container with ID starting with 8d4f83376b7c62cea053850d5772011c50b1c515c0f8afd263c60e919c637cb6 not found: ID does not exist" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.684438 4565 scope.go:117] "RemoveContainer" containerID="0eb4635a97793b3bb616bcbe99baaea1aa1d29ef6cf4364c40c1b2f00a09003c" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.686083 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eb4635a97793b3bb616bcbe99baaea1aa1d29ef6cf4364c40c1b2f00a09003c"} err="failed to get container status \"0eb4635a97793b3bb616bcbe99baaea1aa1d29ef6cf4364c40c1b2f00a09003c\": rpc error: code = NotFound desc = could not find container \"0eb4635a97793b3bb616bcbe99baaea1aa1d29ef6cf4364c40c1b2f00a09003c\": container with ID starting with 0eb4635a97793b3bb616bcbe99baaea1aa1d29ef6cf4364c40c1b2f00a09003c not found: ID does not exist" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.696700 4565 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/ba79431b-1632-436a-86bc-ff20d359a24f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ba79431b-1632-436a-86bc-ff20d359a24f" (UID: "ba79431b-1632-436a-86bc-ff20d359a24f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.724989 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba79431b-1632-436a-86bc-ff20d359a24f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba79431b-1632-436a-86bc-ff20d359a24f" (UID: "ba79431b-1632-436a-86bc-ff20d359a24f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.752427 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba79431b-1632-436a-86bc-ff20d359a24f-config-data" (OuterVolumeSpecName: "config-data") pod "ba79431b-1632-436a-86bc-ff20d359a24f" (UID: "ba79431b-1632-436a-86bc-ff20d359a24f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.759787 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4a7875ff-89fe-4226-904b-622edafc2aac-httpd-config\") pod \"4a7875ff-89fe-4226-904b-622edafc2aac\" (UID: \"4a7875ff-89fe-4226-904b-622edafc2aac\") " Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.759918 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cf28g\" (UniqueName: \"kubernetes.io/projected/4a7875ff-89fe-4226-904b-622edafc2aac-kube-api-access-cf28g\") pod \"4a7875ff-89fe-4226-904b-622edafc2aac\" (UID: \"4a7875ff-89fe-4226-904b-622edafc2aac\") " Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.759972 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a7875ff-89fe-4226-904b-622edafc2aac-ovndb-tls-certs\") pod \"4a7875ff-89fe-4226-904b-622edafc2aac\" (UID: \"4a7875ff-89fe-4226-904b-622edafc2aac\") " Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.760149 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7875ff-89fe-4226-904b-622edafc2aac-combined-ca-bundle\") pod \"4a7875ff-89fe-4226-904b-622edafc2aac\" (UID: \"4a7875ff-89fe-4226-904b-622edafc2aac\") " Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.760220 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a7875ff-89fe-4226-904b-622edafc2aac-config\") pod \"4a7875ff-89fe-4226-904b-622edafc2aac\" (UID: \"4a7875ff-89fe-4226-904b-622edafc2aac\") " Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.761013 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kjc8\" (UniqueName: 
\"kubernetes.io/projected/ba79431b-1632-436a-86bc-ff20d359a24f-kube-api-access-4kjc8\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.761035 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba79431b-1632-436a-86bc-ff20d359a24f-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.761046 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba79431b-1632-436a-86bc-ff20d359a24f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.761061 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba79431b-1632-436a-86bc-ff20d359a24f-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.761072 4565 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba79431b-1632-436a-86bc-ff20d359a24f-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.761083 4565 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba79431b-1632-436a-86bc-ff20d359a24f-logs\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.780085 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a7875ff-89fe-4226-904b-622edafc2aac-kube-api-access-cf28g" (OuterVolumeSpecName: "kube-api-access-cf28g") pod "4a7875ff-89fe-4226-904b-622edafc2aac" (UID: "4a7875ff-89fe-4226-904b-622edafc2aac"). InnerVolumeSpecName "kube-api-access-cf28g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.786265 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a7875ff-89fe-4226-904b-622edafc2aac-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "4a7875ff-89fe-4226-904b-622edafc2aac" (UID: "4a7875ff-89fe-4226-904b-622edafc2aac"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.838032 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a7875ff-89fe-4226-904b-622edafc2aac-config" (OuterVolumeSpecName: "config") pod "4a7875ff-89fe-4226-904b-622edafc2aac" (UID: "4a7875ff-89fe-4226-904b-622edafc2aac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.841202 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a7875ff-89fe-4226-904b-622edafc2aac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a7875ff-89fe-4226-904b-622edafc2aac" (UID: "4a7875ff-89fe-4226-904b-622edafc2aac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.859018 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.861794 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.866247 4565 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4a7875ff-89fe-4226-904b-622edafc2aac-httpd-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.866339 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cf28g\" (UniqueName: \"kubernetes.io/projected/4a7875ff-89fe-4226-904b-622edafc2aac-kube-api-access-cf28g\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.866392 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7875ff-89fe-4226-904b-622edafc2aac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.866456 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a7875ff-89fe-4226-904b-622edafc2aac-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.879104 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 25 09:19:52 crc kubenswrapper[4565]: E1125 09:19:52.879600 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba79431b-1632-436a-86bc-ff20d359a24f" containerName="cinder-api" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.879622 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba79431b-1632-436a-86bc-ff20d359a24f" containerName="cinder-api" Nov 25 09:19:52 crc kubenswrapper[4565]: E1125 
09:19:52.879636 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba79431b-1632-436a-86bc-ff20d359a24f" containerName="cinder-api-log" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.879642 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba79431b-1632-436a-86bc-ff20d359a24f" containerName="cinder-api-log" Nov 25 09:19:52 crc kubenswrapper[4565]: E1125 09:19:52.879683 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a7875ff-89fe-4226-904b-622edafc2aac" containerName="neutron-api" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.879690 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a7875ff-89fe-4226-904b-622edafc2aac" containerName="neutron-api" Nov 25 09:19:52 crc kubenswrapper[4565]: E1125 09:19:52.879706 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a7875ff-89fe-4226-904b-622edafc2aac" containerName="neutron-httpd" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.879712 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a7875ff-89fe-4226-904b-622edafc2aac" containerName="neutron-httpd" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.879955 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba79431b-1632-436a-86bc-ff20d359a24f" containerName="cinder-api-log" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.879971 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a7875ff-89fe-4226-904b-622edafc2aac" containerName="neutron-api" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.879982 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a7875ff-89fe-4226-904b-622edafc2aac" containerName="neutron-httpd" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.879992 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba79431b-1632-436a-86bc-ff20d359a24f" containerName="cinder-api" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.886015 4565 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a7875ff-89fe-4226-904b-622edafc2aac-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "4a7875ff-89fe-4226-904b-622edafc2aac" (UID: "4a7875ff-89fe-4226-904b-622edafc2aac"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.886493 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.887021 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.898448 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.898510 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.898722 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.968923 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb942\" (UniqueName: \"kubernetes.io/projected/dd508fe3-6c52-4b41-a308-7f8697523a81-kube-api-access-jb942\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.969004 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd508fe3-6c52-4b41-a308-7f8697523a81-scripts\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 
09:19:52.969048 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd508fe3-6c52-4b41-a308-7f8697523a81-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.969068 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd508fe3-6c52-4b41-a308-7f8697523a81-config-data\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.969117 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd508fe3-6c52-4b41-a308-7f8697523a81-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.969186 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd508fe3-6c52-4b41-a308-7f8697523a81-config-data-custom\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.969207 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd508fe3-6c52-4b41-a308-7f8697523a81-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.969243 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd508fe3-6c52-4b41-a308-7f8697523a81-logs\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.969294 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd508fe3-6c52-4b41-a308-7f8697523a81-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:52 crc kubenswrapper[4565]: I1125 09:19:52.969349 4565 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a7875ff-89fe-4226-904b-622edafc2aac-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.009100 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.070859 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd508fe3-6c52-4b41-a308-7f8697523a81-scripts\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.071031 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd508fe3-6c52-4b41-a308-7f8697523a81-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.071146 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd508fe3-6c52-4b41-a308-7f8697523a81-config-data\") pod \"cinder-api-0\" (UID: 
\"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.071262 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd508fe3-6c52-4b41-a308-7f8697523a81-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.071773 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd508fe3-6c52-4b41-a308-7f8697523a81-config-data-custom\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.072173 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd508fe3-6c52-4b41-a308-7f8697523a81-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.072260 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd508fe3-6c52-4b41-a308-7f8697523a81-logs\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.072373 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd508fe3-6c52-4b41-a308-7f8697523a81-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.072461 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jb942\" (UniqueName: \"kubernetes.io/projected/dd508fe3-6c52-4b41-a308-7f8697523a81-kube-api-access-jb942\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.071152 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd508fe3-6c52-4b41-a308-7f8697523a81-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.074205 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd508fe3-6c52-4b41-a308-7f8697523a81-logs\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.077760 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd508fe3-6c52-4b41-a308-7f8697523a81-config-data-custom\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.078774 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd508fe3-6c52-4b41-a308-7f8697523a81-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.081129 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd508fe3-6c52-4b41-a308-7f8697523a81-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:53 crc 
kubenswrapper[4565]: I1125 09:19:53.081503 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd508fe3-6c52-4b41-a308-7f8697523a81-scripts\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.081586 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd508fe3-6c52-4b41-a308-7f8697523a81-config-data\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.082321 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd508fe3-6c52-4b41-a308-7f8697523a81-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.089085 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb942\" (UniqueName: \"kubernetes.io/projected/dd508fe3-6c52-4b41-a308-7f8697523a81-kube-api-access-jb942\") pod \"cinder-api-0\" (UID: \"dd508fe3-6c52-4b41-a308-7f8697523a81\") " pod="openstack/cinder-api-0" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.114766 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba79431b-1632-436a-86bc-ff20d359a24f" path="/var/lib/kubelet/pods/ba79431b-1632-436a-86bc-ff20d359a24f/volumes" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.204311 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.409093 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.509848 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b4c469f64-4jfxz" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.559209 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b76df5f9b-dzngk" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.559130 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b76df5f9b-dzngk" event={"ID":"4a7875ff-89fe-4226-904b-622edafc2aac","Type":"ContainerDied","Data":"dd4c715d8252ab5be1315e1e80cd94cd8c9bcf306b09abb20a281e8ad214e2a0"} Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.561239 4565 scope.go:117] "RemoveContainer" containerID="973f7a2aeb8540746e3853b1737e057059600417fb0500a77215b381aa150c99" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.599349 4565 scope.go:117] "RemoveContainer" containerID="7e2778b9507db2a090098f46b3d8535fb3445c059d9aa00e260b6fe9bf076525" Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.602373 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-597dbd885b-gqpd6"] Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.602615 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-597dbd885b-gqpd6" podUID="1403724f-a8aa-495c-9c1d-1e7a3eccb889" containerName="barbican-api-log" containerID="cri-o://59d56bec96647d30edc726f1e42a90066ba80f0c304b540c9e72960bc9dac743" gracePeriod=30 Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.602916 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-597dbd885b-gqpd6" 
podUID="1403724f-a8aa-495c-9c1d-1e7a3eccb889" containerName="barbican-api" containerID="cri-o://baff7536cf37610b159060ba529918b8cb17a84baeb0158d01623c8b9021d26d" gracePeriod=30 Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.620288 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b76df5f9b-dzngk"] Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.627305 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b76df5f9b-dzngk"] Nov 25 09:19:53 crc kubenswrapper[4565]: I1125 09:19:53.671440 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 25 09:19:54 crc kubenswrapper[4565]: I1125 09:19:54.573716 4565 generic.go:334] "Generic (PLEG): container finished" podID="1403724f-a8aa-495c-9c1d-1e7a3eccb889" containerID="59d56bec96647d30edc726f1e42a90066ba80f0c304b540c9e72960bc9dac743" exitCode=143 Nov 25 09:19:54 crc kubenswrapper[4565]: I1125 09:19:54.574364 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-597dbd885b-gqpd6" event={"ID":"1403724f-a8aa-495c-9c1d-1e7a3eccb889","Type":"ContainerDied","Data":"59d56bec96647d30edc726f1e42a90066ba80f0c304b540c9e72960bc9dac743"} Nov 25 09:19:54 crc kubenswrapper[4565]: I1125 09:19:54.577616 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dd508fe3-6c52-4b41-a308-7f8697523a81","Type":"ContainerStarted","Data":"aed0634de7d10e41d6b3cf0e234f632a067e81ede8e2d01059657fe3d0fcb447"} Nov 25 09:19:54 crc kubenswrapper[4565]: I1125 09:19:54.577643 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dd508fe3-6c52-4b41-a308-7f8697523a81","Type":"ContainerStarted","Data":"e42b6d07fbca6b5922603d651ba60121057689a97e4826ee7af705f344785c3d"} Nov 25 09:19:55 crc kubenswrapper[4565]: I1125 09:19:55.099712 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:19:55 crc kubenswrapper[4565]: I1125 09:19:55.099777 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:19:55 crc kubenswrapper[4565]: I1125 09:19:55.111599 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a7875ff-89fe-4226-904b-622edafc2aac" path="/var/lib/kubelet/pods/4a7875ff-89fe-4226-904b-622edafc2aac/volumes" Nov 25 09:19:55 crc kubenswrapper[4565]: I1125 09:19:55.590371 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dd508fe3-6c52-4b41-a308-7f8697523a81","Type":"ContainerStarted","Data":"67aadbb9cb231ce20b0a1c222c42b54afdf8b120a5dd327dd37658f8cdb00b39"} Nov 25 09:19:55 crc kubenswrapper[4565]: I1125 09:19:55.591488 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.497297 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-597dbd885b-gqpd6" Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.560213 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.560191624 podStartE2EDuration="5.560191624s" podCreationTimestamp="2025-11-25 09:19:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:19:55.610586287 +0000 UTC m=+928.813081425" watchObservedRunningTime="2025-11-25 09:19:57.560191624 +0000 UTC m=+930.762686762" Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.613856 4565 generic.go:334] "Generic (PLEG): container finished" podID="1403724f-a8aa-495c-9c1d-1e7a3eccb889" containerID="baff7536cf37610b159060ba529918b8cb17a84baeb0158d01623c8b9021d26d" exitCode=0 Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.613958 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-597dbd885b-gqpd6" Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.614032 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-597dbd885b-gqpd6" event={"ID":"1403724f-a8aa-495c-9c1d-1e7a3eccb889","Type":"ContainerDied","Data":"baff7536cf37610b159060ba529918b8cb17a84baeb0158d01623c8b9021d26d"} Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.614073 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-597dbd885b-gqpd6" event={"ID":"1403724f-a8aa-495c-9c1d-1e7a3eccb889","Type":"ContainerDied","Data":"d2929054be334019c0ac10900517f1c395d0e5064e4f6248e6c9c2c84495acd9"} Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.614097 4565 scope.go:117] "RemoveContainer" containerID="baff7536cf37610b159060ba529918b8cb17a84baeb0158d01623c8b9021d26d" Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.639418 4565 scope.go:117] "RemoveContainer" 
containerID="59d56bec96647d30edc726f1e42a90066ba80f0c304b540c9e72960bc9dac743" Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.667822 4565 scope.go:117] "RemoveContainer" containerID="baff7536cf37610b159060ba529918b8cb17a84baeb0158d01623c8b9021d26d" Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.672473 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1403724f-a8aa-495c-9c1d-1e7a3eccb889-config-data\") pod \"1403724f-a8aa-495c-9c1d-1e7a3eccb889\" (UID: \"1403724f-a8aa-495c-9c1d-1e7a3eccb889\") " Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.672540 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1403724f-a8aa-495c-9c1d-1e7a3eccb889-combined-ca-bundle\") pod \"1403724f-a8aa-495c-9c1d-1e7a3eccb889\" (UID: \"1403724f-a8aa-495c-9c1d-1e7a3eccb889\") " Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.672791 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd7v4\" (UniqueName: \"kubernetes.io/projected/1403724f-a8aa-495c-9c1d-1e7a3eccb889-kube-api-access-sd7v4\") pod \"1403724f-a8aa-495c-9c1d-1e7a3eccb889\" (UID: \"1403724f-a8aa-495c-9c1d-1e7a3eccb889\") " Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.672834 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1403724f-a8aa-495c-9c1d-1e7a3eccb889-logs\") pod \"1403724f-a8aa-495c-9c1d-1e7a3eccb889\" (UID: \"1403724f-a8aa-495c-9c1d-1e7a3eccb889\") " Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.672858 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1403724f-a8aa-495c-9c1d-1e7a3eccb889-config-data-custom\") pod \"1403724f-a8aa-495c-9c1d-1e7a3eccb889\" (UID: 
\"1403724f-a8aa-495c-9c1d-1e7a3eccb889\") " Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.675340 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1403724f-a8aa-495c-9c1d-1e7a3eccb889-logs" (OuterVolumeSpecName: "logs") pod "1403724f-a8aa-495c-9c1d-1e7a3eccb889" (UID: "1403724f-a8aa-495c-9c1d-1e7a3eccb889"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:19:57 crc kubenswrapper[4565]: E1125 09:19:57.675774 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baff7536cf37610b159060ba529918b8cb17a84baeb0158d01623c8b9021d26d\": container with ID starting with baff7536cf37610b159060ba529918b8cb17a84baeb0158d01623c8b9021d26d not found: ID does not exist" containerID="baff7536cf37610b159060ba529918b8cb17a84baeb0158d01623c8b9021d26d" Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.675821 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baff7536cf37610b159060ba529918b8cb17a84baeb0158d01623c8b9021d26d"} err="failed to get container status \"baff7536cf37610b159060ba529918b8cb17a84baeb0158d01623c8b9021d26d\": rpc error: code = NotFound desc = could not find container \"baff7536cf37610b159060ba529918b8cb17a84baeb0158d01623c8b9021d26d\": container with ID starting with baff7536cf37610b159060ba529918b8cb17a84baeb0158d01623c8b9021d26d not found: ID does not exist" Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.675853 4565 scope.go:117] "RemoveContainer" containerID="59d56bec96647d30edc726f1e42a90066ba80f0c304b540c9e72960bc9dac743" Nov 25 09:19:57 crc kubenswrapper[4565]: E1125 09:19:57.677363 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59d56bec96647d30edc726f1e42a90066ba80f0c304b540c9e72960bc9dac743\": container with ID starting with 
59d56bec96647d30edc726f1e42a90066ba80f0c304b540c9e72960bc9dac743 not found: ID does not exist" containerID="59d56bec96647d30edc726f1e42a90066ba80f0c304b540c9e72960bc9dac743" Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.677405 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59d56bec96647d30edc726f1e42a90066ba80f0c304b540c9e72960bc9dac743"} err="failed to get container status \"59d56bec96647d30edc726f1e42a90066ba80f0c304b540c9e72960bc9dac743\": rpc error: code = NotFound desc = could not find container \"59d56bec96647d30edc726f1e42a90066ba80f0c304b540c9e72960bc9dac743\": container with ID starting with 59d56bec96647d30edc726f1e42a90066ba80f0c304b540c9e72960bc9dac743 not found: ID does not exist" Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.699613 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1403724f-a8aa-495c-9c1d-1e7a3eccb889-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1403724f-a8aa-495c-9c1d-1e7a3eccb889" (UID: "1403724f-a8aa-495c-9c1d-1e7a3eccb889"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.699622 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1403724f-a8aa-495c-9c1d-1e7a3eccb889-kube-api-access-sd7v4" (OuterVolumeSpecName: "kube-api-access-sd7v4") pod "1403724f-a8aa-495c-9c1d-1e7a3eccb889" (UID: "1403724f-a8aa-495c-9c1d-1e7a3eccb889"). InnerVolumeSpecName "kube-api-access-sd7v4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.708148 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1403724f-a8aa-495c-9c1d-1e7a3eccb889-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1403724f-a8aa-495c-9c1d-1e7a3eccb889" (UID: "1403724f-a8aa-495c-9c1d-1e7a3eccb889"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.738311 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1403724f-a8aa-495c-9c1d-1e7a3eccb889-config-data" (OuterVolumeSpecName: "config-data") pod "1403724f-a8aa-495c-9c1d-1e7a3eccb889" (UID: "1403724f-a8aa-495c-9c1d-1e7a3eccb889"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.775169 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd7v4\" (UniqueName: \"kubernetes.io/projected/1403724f-a8aa-495c-9c1d-1e7a3eccb889-kube-api-access-sd7v4\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.775197 4565 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1403724f-a8aa-495c-9c1d-1e7a3eccb889-logs\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.775207 4565 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1403724f-a8aa-495c-9c1d-1e7a3eccb889-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.775216 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1403724f-a8aa-495c-9c1d-1e7a3eccb889-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 
09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.775229 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1403724f-a8aa-495c-9c1d-1e7a3eccb889-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.951876 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-597dbd885b-gqpd6"] Nov 25 09:19:57 crc kubenswrapper[4565]: I1125 09:19:57.964225 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-597dbd885b-gqpd6"] Nov 25 09:19:58 crc kubenswrapper[4565]: I1125 09:19:58.212066 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-775457b975-zp8vm" Nov 25 09:19:58 crc kubenswrapper[4565]: I1125 09:19:58.311429 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 25 09:19:58 crc kubenswrapper[4565]: I1125 09:19:58.358897 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-844b557b9c-554f5"] Nov 25 09:19:58 crc kubenswrapper[4565]: I1125 09:19:58.360004 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-844b557b9c-554f5" podUID="578e3cd4-5b76-4685-ad96-81427aff9bca" containerName="dnsmasq-dns" containerID="cri-o://558bfa667fe8afb5c253572686950fce1d3220bf2ccd8d52dead554f3ab0d74c" gracePeriod=10 Nov 25 09:19:58 crc kubenswrapper[4565]: I1125 09:19:58.389715 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 09:19:58 crc kubenswrapper[4565]: I1125 09:19:58.591964 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 09:19:58 crc kubenswrapper[4565]: I1125 09:19:58.592032 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-66b84f9bd8-pdssf" Nov 25 
09:19:58 crc kubenswrapper[4565]: I1125 09:19:58.646942 4565 generic.go:334] "Generic (PLEG): container finished" podID="578e3cd4-5b76-4685-ad96-81427aff9bca" containerID="558bfa667fe8afb5c253572686950fce1d3220bf2ccd8d52dead554f3ab0d74c" exitCode=0 Nov 25 09:19:58 crc kubenswrapper[4565]: I1125 09:19:58.647004 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844b557b9c-554f5" event={"ID":"578e3cd4-5b76-4685-ad96-81427aff9bca","Type":"ContainerDied","Data":"558bfa667fe8afb5c253572686950fce1d3220bf2ccd8d52dead554f3ab0d74c"} Nov 25 09:19:58 crc kubenswrapper[4565]: I1125 09:19:58.649025 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d85173b8-4999-4cd5-90e0-bae68dd0eeca" containerName="cinder-scheduler" containerID="cri-o://b5e49c5c21aed59471de4026ada558e6264593b7523ca34429f5b3127979c9a8" gracePeriod=30 Nov 25 09:19:58 crc kubenswrapper[4565]: I1125 09:19:58.649528 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d85173b8-4999-4cd5-90e0-bae68dd0eeca" containerName="probe" containerID="cri-o://a54c509f400d7d8fb04d2948975467d18f53f2776f3fd62b04bef1ae5fa0aaff" gracePeriod=30 Nov 25 09:19:58 crc kubenswrapper[4565]: I1125 09:19:58.893725 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-844b557b9c-554f5" Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.013749 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/578e3cd4-5b76-4685-ad96-81427aff9bca-config\") pod \"578e3cd4-5b76-4685-ad96-81427aff9bca\" (UID: \"578e3cd4-5b76-4685-ad96-81427aff9bca\") " Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.013957 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/578e3cd4-5b76-4685-ad96-81427aff9bca-ovsdbserver-nb\") pod \"578e3cd4-5b76-4685-ad96-81427aff9bca\" (UID: \"578e3cd4-5b76-4685-ad96-81427aff9bca\") " Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.014033 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/578e3cd4-5b76-4685-ad96-81427aff9bca-ovsdbserver-sb\") pod \"578e3cd4-5b76-4685-ad96-81427aff9bca\" (UID: \"578e3cd4-5b76-4685-ad96-81427aff9bca\") " Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.014111 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/578e3cd4-5b76-4685-ad96-81427aff9bca-dns-svc\") pod \"578e3cd4-5b76-4685-ad96-81427aff9bca\" (UID: \"578e3cd4-5b76-4685-ad96-81427aff9bca\") " Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.014161 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfh78\" (UniqueName: \"kubernetes.io/projected/578e3cd4-5b76-4685-ad96-81427aff9bca-kube-api-access-pfh78\") pod \"578e3cd4-5b76-4685-ad96-81427aff9bca\" (UID: \"578e3cd4-5b76-4685-ad96-81427aff9bca\") " Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.019918 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/578e3cd4-5b76-4685-ad96-81427aff9bca-kube-api-access-pfh78" (OuterVolumeSpecName: "kube-api-access-pfh78") pod "578e3cd4-5b76-4685-ad96-81427aff9bca" (UID: "578e3cd4-5b76-4685-ad96-81427aff9bca"). InnerVolumeSpecName "kube-api-access-pfh78". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.054140 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/578e3cd4-5b76-4685-ad96-81427aff9bca-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "578e3cd4-5b76-4685-ad96-81427aff9bca" (UID: "578e3cd4-5b76-4685-ad96-81427aff9bca"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.057683 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/578e3cd4-5b76-4685-ad96-81427aff9bca-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "578e3cd4-5b76-4685-ad96-81427aff9bca" (UID: "578e3cd4-5b76-4685-ad96-81427aff9bca"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.066325 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/578e3cd4-5b76-4685-ad96-81427aff9bca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "578e3cd4-5b76-4685-ad96-81427aff9bca" (UID: "578e3cd4-5b76-4685-ad96-81427aff9bca"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.069186 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/578e3cd4-5b76-4685-ad96-81427aff9bca-config" (OuterVolumeSpecName: "config") pod "578e3cd4-5b76-4685-ad96-81427aff9bca" (UID: "578e3cd4-5b76-4685-ad96-81427aff9bca"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.107994 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1403724f-a8aa-495c-9c1d-1e7a3eccb889" path="/var/lib/kubelet/pods/1403724f-a8aa-495c-9c1d-1e7a3eccb889/volumes" Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.116578 4565 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/578e3cd4-5b76-4685-ad96-81427aff9bca-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.116623 4565 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/578e3cd4-5b76-4685-ad96-81427aff9bca-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.116635 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfh78\" (UniqueName: \"kubernetes.io/projected/578e3cd4-5b76-4685-ad96-81427aff9bca-kube-api-access-pfh78\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.116647 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/578e3cd4-5b76-4685-ad96-81427aff9bca-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.116656 4565 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/578e3cd4-5b76-4685-ad96-81427aff9bca-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.665027 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-844b557b9c-554f5" Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.665904 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844b557b9c-554f5" event={"ID":"578e3cd4-5b76-4685-ad96-81427aff9bca","Type":"ContainerDied","Data":"19157cc860a6625999a631401bbea898f9ef6db4c06d969931e6abe0d243d144"} Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.665956 4565 scope.go:117] "RemoveContainer" containerID="558bfa667fe8afb5c253572686950fce1d3220bf2ccd8d52dead554f3ab0d74c" Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.678730 4565 generic.go:334] "Generic (PLEG): container finished" podID="d85173b8-4999-4cd5-90e0-bae68dd0eeca" containerID="a54c509f400d7d8fb04d2948975467d18f53f2776f3fd62b04bef1ae5fa0aaff" exitCode=0 Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.678770 4565 generic.go:334] "Generic (PLEG): container finished" podID="d85173b8-4999-4cd5-90e0-bae68dd0eeca" containerID="b5e49c5c21aed59471de4026ada558e6264593b7523ca34429f5b3127979c9a8" exitCode=0 Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.678791 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d85173b8-4999-4cd5-90e0-bae68dd0eeca","Type":"ContainerDied","Data":"a54c509f400d7d8fb04d2948975467d18f53f2776f3fd62b04bef1ae5fa0aaff"} Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.678830 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d85173b8-4999-4cd5-90e0-bae68dd0eeca","Type":"ContainerDied","Data":"b5e49c5c21aed59471de4026ada558e6264593b7523ca34429f5b3127979c9a8"} Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.686859 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-844b557b9c-554f5"] Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.699538 4565 scope.go:117] "RemoveContainer" 
containerID="64290a60338d32e576ac585404b1e30432b1127a09f36bcab6c153dd64079490" Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.702171 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-844b557b9c-554f5"] Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.782335 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.933834 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d85173b8-4999-4cd5-90e0-bae68dd0eeca-combined-ca-bundle\") pod \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\" (UID: \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\") " Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.934073 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d85173b8-4999-4cd5-90e0-bae68dd0eeca-config-data\") pod \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\" (UID: \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\") " Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.934126 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d85173b8-4999-4cd5-90e0-bae68dd0eeca-scripts\") pod \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\" (UID: \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\") " Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.934241 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d85173b8-4999-4cd5-90e0-bae68dd0eeca-config-data-custom\") pod \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\" (UID: \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\") " Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.934508 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxj2g\" 
(UniqueName: \"kubernetes.io/projected/d85173b8-4999-4cd5-90e0-bae68dd0eeca-kube-api-access-bxj2g\") pod \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\" (UID: \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\") " Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.934555 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d85173b8-4999-4cd5-90e0-bae68dd0eeca-etc-machine-id\") pod \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\" (UID: \"d85173b8-4999-4cd5-90e0-bae68dd0eeca\") " Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.935207 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d85173b8-4999-4cd5-90e0-bae68dd0eeca-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d85173b8-4999-4cd5-90e0-bae68dd0eeca" (UID: "d85173b8-4999-4cd5-90e0-bae68dd0eeca"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.935626 4565 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d85173b8-4999-4cd5-90e0-bae68dd0eeca-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.940895 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d85173b8-4999-4cd5-90e0-bae68dd0eeca-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d85173b8-4999-4cd5-90e0-bae68dd0eeca" (UID: "d85173b8-4999-4cd5-90e0-bae68dd0eeca"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.941210 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d85173b8-4999-4cd5-90e0-bae68dd0eeca-scripts" (OuterVolumeSpecName: "scripts") pod "d85173b8-4999-4cd5-90e0-bae68dd0eeca" (UID: "d85173b8-4999-4cd5-90e0-bae68dd0eeca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.942467 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d85173b8-4999-4cd5-90e0-bae68dd0eeca-kube-api-access-bxj2g" (OuterVolumeSpecName: "kube-api-access-bxj2g") pod "d85173b8-4999-4cd5-90e0-bae68dd0eeca" (UID: "d85173b8-4999-4cd5-90e0-bae68dd0eeca"). InnerVolumeSpecName "kube-api-access-bxj2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:19:59 crc kubenswrapper[4565]: I1125 09:19:59.986761 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d85173b8-4999-4cd5-90e0-bae68dd0eeca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d85173b8-4999-4cd5-90e0-bae68dd0eeca" (UID: "d85173b8-4999-4cd5-90e0-bae68dd0eeca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.020062 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d85173b8-4999-4cd5-90e0-bae68dd0eeca-config-data" (OuterVolumeSpecName: "config-data") pod "d85173b8-4999-4cd5-90e0-bae68dd0eeca" (UID: "d85173b8-4999-4cd5-90e0-bae68dd0eeca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.039585 4565 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d85173b8-4999-4cd5-90e0-bae68dd0eeca-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.039623 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxj2g\" (UniqueName: \"kubernetes.io/projected/d85173b8-4999-4cd5-90e0-bae68dd0eeca-kube-api-access-bxj2g\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.039645 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d85173b8-4999-4cd5-90e0-bae68dd0eeca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.039657 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d85173b8-4999-4cd5-90e0-bae68dd0eeca-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.039666 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d85173b8-4999-4cd5-90e0-bae68dd0eeca-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.689214 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d85173b8-4999-4cd5-90e0-bae68dd0eeca","Type":"ContainerDied","Data":"b977f5034bddd21612d8ee2fdd1a2844133dc89efbee62ef39c08f38b6bb04cf"} Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.689256 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.689980 4565 scope.go:117] "RemoveContainer" containerID="a54c509f400d7d8fb04d2948975467d18f53f2776f3fd62b04bef1ae5fa0aaff" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.713135 4565 scope.go:117] "RemoveContainer" containerID="b5e49c5c21aed59471de4026ada558e6264593b7523ca34429f5b3127979c9a8" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.722102 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.736589 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.756889 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 09:20:00 crc kubenswrapper[4565]: E1125 09:20:00.757333 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="578e3cd4-5b76-4685-ad96-81427aff9bca" containerName="init" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.757354 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="578e3cd4-5b76-4685-ad96-81427aff9bca" containerName="init" Nov 25 09:20:00 crc kubenswrapper[4565]: E1125 09:20:00.757376 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1403724f-a8aa-495c-9c1d-1e7a3eccb889" containerName="barbican-api" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.757385 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="1403724f-a8aa-495c-9c1d-1e7a3eccb889" containerName="barbican-api" Nov 25 09:20:00 crc kubenswrapper[4565]: E1125 09:20:00.757399 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="578e3cd4-5b76-4685-ad96-81427aff9bca" containerName="dnsmasq-dns" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.757406 4565 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="578e3cd4-5b76-4685-ad96-81427aff9bca" containerName="dnsmasq-dns" Nov 25 09:20:00 crc kubenswrapper[4565]: E1125 09:20:00.757413 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d85173b8-4999-4cd5-90e0-bae68dd0eeca" containerName="probe" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.757419 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d85173b8-4999-4cd5-90e0-bae68dd0eeca" containerName="probe" Nov 25 09:20:00 crc kubenswrapper[4565]: E1125 09:20:00.757433 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d85173b8-4999-4cd5-90e0-bae68dd0eeca" containerName="cinder-scheduler" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.757440 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d85173b8-4999-4cd5-90e0-bae68dd0eeca" containerName="cinder-scheduler" Nov 25 09:20:00 crc kubenswrapper[4565]: E1125 09:20:00.757450 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1403724f-a8aa-495c-9c1d-1e7a3eccb889" containerName="barbican-api-log" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.757455 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="1403724f-a8aa-495c-9c1d-1e7a3eccb889" containerName="barbican-api-log" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.757627 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="1403724f-a8aa-495c-9c1d-1e7a3eccb889" containerName="barbican-api" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.757641 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="1403724f-a8aa-495c-9c1d-1e7a3eccb889" containerName="barbican-api-log" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.757655 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="578e3cd4-5b76-4685-ad96-81427aff9bca" containerName="dnsmasq-dns" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.757670 4565 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d85173b8-4999-4cd5-90e0-bae68dd0eeca" containerName="cinder-scheduler" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.757678 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="d85173b8-4999-4cd5-90e0-bae68dd0eeca" containerName="probe" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.758628 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.764393 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.772651 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.856804 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/192bb64d-39b3-4dad-a57b-65afe8c7ec7e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"192bb64d-39b3-4dad-a57b-65afe8c7ec7e\") " pod="openstack/cinder-scheduler-0" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.857049 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/192bb64d-39b3-4dad-a57b-65afe8c7ec7e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"192bb64d-39b3-4dad-a57b-65afe8c7ec7e\") " pod="openstack/cinder-scheduler-0" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.857292 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/192bb64d-39b3-4dad-a57b-65afe8c7ec7e-scripts\") pod \"cinder-scheduler-0\" (UID: \"192bb64d-39b3-4dad-a57b-65afe8c7ec7e\") " pod="openstack/cinder-scheduler-0" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.857361 4565 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/192bb64d-39b3-4dad-a57b-65afe8c7ec7e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"192bb64d-39b3-4dad-a57b-65afe8c7ec7e\") " pod="openstack/cinder-scheduler-0" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.857447 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnkwq\" (UniqueName: \"kubernetes.io/projected/192bb64d-39b3-4dad-a57b-65afe8c7ec7e-kube-api-access-rnkwq\") pod \"cinder-scheduler-0\" (UID: \"192bb64d-39b3-4dad-a57b-65afe8c7ec7e\") " pod="openstack/cinder-scheduler-0" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.857581 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/192bb64d-39b3-4dad-a57b-65afe8c7ec7e-config-data\") pod \"cinder-scheduler-0\" (UID: \"192bb64d-39b3-4dad-a57b-65afe8c7ec7e\") " pod="openstack/cinder-scheduler-0" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.959273 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/192bb64d-39b3-4dad-a57b-65afe8c7ec7e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"192bb64d-39b3-4dad-a57b-65afe8c7ec7e\") " pod="openstack/cinder-scheduler-0" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.959338 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/192bb64d-39b3-4dad-a57b-65afe8c7ec7e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"192bb64d-39b3-4dad-a57b-65afe8c7ec7e\") " pod="openstack/cinder-scheduler-0" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.959390 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/192bb64d-39b3-4dad-a57b-65afe8c7ec7e-scripts\") pod \"cinder-scheduler-0\" (UID: \"192bb64d-39b3-4dad-a57b-65afe8c7ec7e\") " pod="openstack/cinder-scheduler-0" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.959420 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/192bb64d-39b3-4dad-a57b-65afe8c7ec7e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"192bb64d-39b3-4dad-a57b-65afe8c7ec7e\") " pod="openstack/cinder-scheduler-0" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.959448 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnkwq\" (UniqueName: \"kubernetes.io/projected/192bb64d-39b3-4dad-a57b-65afe8c7ec7e-kube-api-access-rnkwq\") pod \"cinder-scheduler-0\" (UID: \"192bb64d-39b3-4dad-a57b-65afe8c7ec7e\") " pod="openstack/cinder-scheduler-0" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.959494 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/192bb64d-39b3-4dad-a57b-65afe8c7ec7e-config-data\") pod \"cinder-scheduler-0\" (UID: \"192bb64d-39b3-4dad-a57b-65afe8c7ec7e\") " pod="openstack/cinder-scheduler-0" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.959583 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/192bb64d-39b3-4dad-a57b-65afe8c7ec7e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"192bb64d-39b3-4dad-a57b-65afe8c7ec7e\") " pod="openstack/cinder-scheduler-0" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.964260 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/192bb64d-39b3-4dad-a57b-65afe8c7ec7e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"192bb64d-39b3-4dad-a57b-65afe8c7ec7e\") " pod="openstack/cinder-scheduler-0" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.964378 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/192bb64d-39b3-4dad-a57b-65afe8c7ec7e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"192bb64d-39b3-4dad-a57b-65afe8c7ec7e\") " pod="openstack/cinder-scheduler-0" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.966423 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/192bb64d-39b3-4dad-a57b-65afe8c7ec7e-config-data\") pod \"cinder-scheduler-0\" (UID: \"192bb64d-39b3-4dad-a57b-65afe8c7ec7e\") " pod="openstack/cinder-scheduler-0" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.974383 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/192bb64d-39b3-4dad-a57b-65afe8c7ec7e-scripts\") pod \"cinder-scheduler-0\" (UID: \"192bb64d-39b3-4dad-a57b-65afe8c7ec7e\") " pod="openstack/cinder-scheduler-0" Nov 25 09:20:00 crc kubenswrapper[4565]: I1125 09:20:00.978820 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnkwq\" (UniqueName: \"kubernetes.io/projected/192bb64d-39b3-4dad-a57b-65afe8c7ec7e-kube-api-access-rnkwq\") pod \"cinder-scheduler-0\" (UID: \"192bb64d-39b3-4dad-a57b-65afe8c7ec7e\") " pod="openstack/cinder-scheduler-0" Nov 25 09:20:01 crc kubenswrapper[4565]: I1125 09:20:01.099264 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 25 09:20:01 crc kubenswrapper[4565]: I1125 09:20:01.116821 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="578e3cd4-5b76-4685-ad96-81427aff9bca" path="/var/lib/kubelet/pods/578e3cd4-5b76-4685-ad96-81427aff9bca/volumes" Nov 25 09:20:01 crc kubenswrapper[4565]: I1125 09:20:01.117635 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d85173b8-4999-4cd5-90e0-bae68dd0eeca" path="/var/lib/kubelet/pods/d85173b8-4999-4cd5-90e0-bae68dd0eeca/volumes" Nov 25 09:20:01 crc kubenswrapper[4565]: I1125 09:20:01.562652 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-757f65c548-72sgl" Nov 25 09:20:01 crc kubenswrapper[4565]: I1125 09:20:01.607782 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 09:20:01 crc kubenswrapper[4565]: I1125 09:20:01.740172 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"192bb64d-39b3-4dad-a57b-65afe8c7ec7e","Type":"ContainerStarted","Data":"6da85d806626c957ada8950a3e26e657e4b9af08e41dbceb2a451635615fa825"} Nov 25 09:20:02 crc kubenswrapper[4565]: I1125 09:20:02.754124 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"192bb64d-39b3-4dad-a57b-65afe8c7ec7e","Type":"ContainerStarted","Data":"24fba458fc236ade1fd28c9b1cd3d8eafab2376b94f44d1a618b3c3534e75ce4"} Nov 25 09:20:02 crc kubenswrapper[4565]: I1125 09:20:02.754534 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"192bb64d-39b3-4dad-a57b-65afe8c7ec7e","Type":"ContainerStarted","Data":"7440713beb532ee43ee6924c63408c3fb3c69a4a0da7c04da947c79684e9d113"} Nov 25 09:20:02 crc kubenswrapper[4565]: I1125 09:20:02.778636 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" 
podStartSLOduration=2.778612912 podStartE2EDuration="2.778612912s" podCreationTimestamp="2025-11-25 09:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:20:02.772806356 +0000 UTC m=+935.975301494" watchObservedRunningTime="2025-11-25 09:20:02.778612912 +0000 UTC m=+935.981108051" Nov 25 09:20:04 crc kubenswrapper[4565]: I1125 09:20:04.764869 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 25 09:20:04 crc kubenswrapper[4565]: I1125 09:20:04.766053 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 25 09:20:04 crc kubenswrapper[4565]: I1125 09:20:04.770391 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Nov 25 09:20:04 crc kubenswrapper[4565]: I1125 09:20:04.770656 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-z8wjg" Nov 25 09:20:04 crc kubenswrapper[4565]: I1125 09:20:04.770850 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Nov 25 09:20:04 crc kubenswrapper[4565]: I1125 09:20:04.798805 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 25 09:20:04 crc kubenswrapper[4565]: I1125 09:20:04.848342 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhcfj\" (UniqueName: \"kubernetes.io/projected/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e-kube-api-access-lhcfj\") pod \"openstackclient\" (UID: \"f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e\") " pod="openstack/openstackclient" Nov 25 09:20:04 crc kubenswrapper[4565]: I1125 09:20:04.848641 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e-openstack-config-secret\") pod \"openstackclient\" (UID: \"f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e\") " pod="openstack/openstackclient" Nov 25 09:20:04 crc kubenswrapper[4565]: I1125 09:20:04.848725 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e\") " pod="openstack/openstackclient" Nov 25 09:20:04 crc kubenswrapper[4565]: I1125 09:20:04.848901 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e-openstack-config\") pod \"openstackclient\" (UID: \"f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e\") " pod="openstack/openstackclient" Nov 25 09:20:04 crc kubenswrapper[4565]: I1125 09:20:04.923498 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 25 09:20:04 crc kubenswrapper[4565]: I1125 09:20:04.951454 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhcfj\" (UniqueName: \"kubernetes.io/projected/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e-kube-api-access-lhcfj\") pod \"openstackclient\" (UID: \"f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e\") " pod="openstack/openstackclient" Nov 25 09:20:04 crc kubenswrapper[4565]: I1125 09:20:04.951531 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e-openstack-config-secret\") pod \"openstackclient\" (UID: \"f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e\") " pod="openstack/openstackclient" Nov 25 09:20:04 crc kubenswrapper[4565]: I1125 09:20:04.951575 4565 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e\") " pod="openstack/openstackclient" Nov 25 09:20:04 crc kubenswrapper[4565]: I1125 09:20:04.953979 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e-openstack-config\") pod \"openstackclient\" (UID: \"f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e\") " pod="openstack/openstackclient" Nov 25 09:20:04 crc kubenswrapper[4565]: I1125 09:20:04.958691 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e-openstack-config-secret\") pod \"openstackclient\" (UID: \"f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e\") " pod="openstack/openstackclient" Nov 25 09:20:04 crc kubenswrapper[4565]: I1125 09:20:04.960055 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e-openstack-config\") pod \"openstackclient\" (UID: \"f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e\") " pod="openstack/openstackclient" Nov 25 09:20:04 crc kubenswrapper[4565]: I1125 09:20:04.974337 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhcfj\" (UniqueName: \"kubernetes.io/projected/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e-kube-api-access-lhcfj\") pod \"openstackclient\" (UID: \"f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e\") " pod="openstack/openstackclient" Nov 25 09:20:04 crc kubenswrapper[4565]: I1125 09:20:04.974428 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e-combined-ca-bundle\") pod 
\"openstackclient\" (UID: \"f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e\") " pod="openstack/openstackclient" Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.054661 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.056276 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.086471 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.157301 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.165246 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.167671 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 25 09:20:05 crc kubenswrapper[4565]: E1125 09:20:05.223209 4565 log.go:32] "RunPodSandbox from runtime service failed" err=< Nov 25 09:20:05 crc kubenswrapper[4565]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e_0(40b6c4951241ae7607aa79a4f33699f7426c934ca04a06749cbd695cf62346fb): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"40b6c4951241ae7607aa79a4f33699f7426c934ca04a06749cbd695cf62346fb" Netns:"/var/run/netns/5858bf27-18ab-4bcf-8f06-52aafcdc777f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=40b6c4951241ae7607aa79a4f33699f7426c934ca04a06749cbd695cf62346fb;K8S_POD_UID=f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e" 
Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e]: expected pod UID "f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e" but got "7dfd09a5-8627-4394-ac4f-367458ffe0b2" from Kube API Nov 25 09:20:05 crc kubenswrapper[4565]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Nov 25 09:20:05 crc kubenswrapper[4565]: > Nov 25 09:20:05 crc kubenswrapper[4565]: E1125 09:20:05.223706 4565 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Nov 25 09:20:05 crc kubenswrapper[4565]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e_0(40b6c4951241ae7607aa79a4f33699f7426c934ca04a06749cbd695cf62346fb): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"40b6c4951241ae7607aa79a4f33699f7426c934ca04a06749cbd695cf62346fb" Netns:"/var/run/netns/5858bf27-18ab-4bcf-8f06-52aafcdc777f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=40b6c4951241ae7607aa79a4f33699f7426c934ca04a06749cbd695cf62346fb;K8S_POD_UID=f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e]: expected pod UID "f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e" but got "7dfd09a5-8627-4394-ac4f-367458ffe0b2" from Kube API Nov 25 09:20:05 crc 
kubenswrapper[4565]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Nov 25 09:20:05 crc kubenswrapper[4565]: > pod="openstack/openstackclient" Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.281538 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7dfd09a5-8627-4394-ac4f-367458ffe0b2-openstack-config\") pod \"openstackclient\" (UID: \"7dfd09a5-8627-4394-ac4f-367458ffe0b2\") " pod="openstack/openstackclient" Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.281594 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7dfd09a5-8627-4394-ac4f-367458ffe0b2-openstack-config-secret\") pod \"openstackclient\" (UID: \"7dfd09a5-8627-4394-ac4f-367458ffe0b2\") " pod="openstack/openstackclient" Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.281642 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dfd09a5-8627-4394-ac4f-367458ffe0b2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7dfd09a5-8627-4394-ac4f-367458ffe0b2\") " pod="openstack/openstackclient" Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.281752 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz7wn\" (UniqueName: \"kubernetes.io/projected/7dfd09a5-8627-4394-ac4f-367458ffe0b2-kube-api-access-zz7wn\") pod \"openstackclient\" (UID: 
\"7dfd09a5-8627-4394-ac4f-367458ffe0b2\") " pod="openstack/openstackclient" Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.384050 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz7wn\" (UniqueName: \"kubernetes.io/projected/7dfd09a5-8627-4394-ac4f-367458ffe0b2-kube-api-access-zz7wn\") pod \"openstackclient\" (UID: \"7dfd09a5-8627-4394-ac4f-367458ffe0b2\") " pod="openstack/openstackclient" Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.384221 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7dfd09a5-8627-4394-ac4f-367458ffe0b2-openstack-config\") pod \"openstackclient\" (UID: \"7dfd09a5-8627-4394-ac4f-367458ffe0b2\") " pod="openstack/openstackclient" Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.384268 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7dfd09a5-8627-4394-ac4f-367458ffe0b2-openstack-config-secret\") pod \"openstackclient\" (UID: \"7dfd09a5-8627-4394-ac4f-367458ffe0b2\") " pod="openstack/openstackclient" Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.384346 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dfd09a5-8627-4394-ac4f-367458ffe0b2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7dfd09a5-8627-4394-ac4f-367458ffe0b2\") " pod="openstack/openstackclient" Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.385333 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7dfd09a5-8627-4394-ac4f-367458ffe0b2-openstack-config\") pod \"openstackclient\" (UID: \"7dfd09a5-8627-4394-ac4f-367458ffe0b2\") " pod="openstack/openstackclient" Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.389243 4565 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dfd09a5-8627-4394-ac4f-367458ffe0b2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7dfd09a5-8627-4394-ac4f-367458ffe0b2\") " pod="openstack/openstackclient" Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.394095 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7dfd09a5-8627-4394-ac4f-367458ffe0b2-openstack-config-secret\") pod \"openstackclient\" (UID: \"7dfd09a5-8627-4394-ac4f-367458ffe0b2\") " pod="openstack/openstackclient" Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.400309 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz7wn\" (UniqueName: \"kubernetes.io/projected/7dfd09a5-8627-4394-ac4f-367458ffe0b2-kube-api-access-zz7wn\") pod \"openstackclient\" (UID: \"7dfd09a5-8627-4394-ac4f-367458ffe0b2\") " pod="openstack/openstackclient" Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.491761 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.730619 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.799508 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.800348 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7dfd09a5-8627-4394-ac4f-367458ffe0b2","Type":"ContainerStarted","Data":"8aeddb6a1810a504936949d36b286c8c083ffcdafeb6eaa3dc9eed3b3b8028b4"} Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.819610 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.826386 4565 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e" podUID="7dfd09a5-8627-4394-ac4f-367458ffe0b2" Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.896884 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e-openstack-config\") pod \"f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e\" (UID: \"f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e\") " Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.896994 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e-openstack-config-secret\") pod \"f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e\" (UID: \"f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e\") " Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.897032 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhcfj\" (UniqueName: \"kubernetes.io/projected/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e-kube-api-access-lhcfj\") pod \"f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e\" (UID: \"f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e\") " Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.897209 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e-combined-ca-bundle\") pod \"f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e\" (UID: \"f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e\") " Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.898079 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e" (UID: "f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.899595 4565 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.905162 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e" (UID: "f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.907004 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e-kube-api-access-lhcfj" (OuterVolumeSpecName: "kube-api-access-lhcfj") pod "f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e" (UID: "f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e"). InnerVolumeSpecName "kube-api-access-lhcfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:20:05 crc kubenswrapper[4565]: I1125 09:20:05.907118 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e" (UID: "f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:20:06 crc kubenswrapper[4565]: I1125 09:20:06.001590 4565 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:06 crc kubenswrapper[4565]: I1125 09:20:06.001628 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhcfj\" (UniqueName: \"kubernetes.io/projected/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e-kube-api-access-lhcfj\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:06 crc kubenswrapper[4565]: I1125 09:20:06.001640 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:06 crc kubenswrapper[4565]: I1125 09:20:06.099607 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 25 09:20:06 crc kubenswrapper[4565]: I1125 09:20:06.808334 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 25 09:20:06 crc kubenswrapper[4565]: I1125 09:20:06.821942 4565 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e" podUID="7dfd09a5-8627-4394-ac4f-367458ffe0b2" Nov 25 09:20:07 crc kubenswrapper[4565]: I1125 09:20:07.110697 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e" path="/var/lib/kubelet/pods/f52d128b-ea44-4fd6-aa2a-4fc0d516ac2e/volumes" Nov 25 09:20:09 crc kubenswrapper[4565]: I1125 09:20:09.361280 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:20:09 crc kubenswrapper[4565]: I1125 09:20:09.362049 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dab6b35a-23d8-4736-9cf9-87dbec1101d1" containerName="ceilometer-central-agent" containerID="cri-o://b0d1f9b50f5fc9dc981ea58a645b4449ceecf493f3fc9764d59d32f72982ef9a" gracePeriod=30 Nov 25 09:20:09 crc kubenswrapper[4565]: I1125 09:20:09.362998 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dab6b35a-23d8-4736-9cf9-87dbec1101d1" containerName="proxy-httpd" containerID="cri-o://67a93da19e1a3aac3d51f5ae7df42f5ed58a8d04eb4e5f9b3775dc5ab2c19836" gracePeriod=30 Nov 25 09:20:09 crc kubenswrapper[4565]: I1125 09:20:09.363083 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dab6b35a-23d8-4736-9cf9-87dbec1101d1" containerName="sg-core" containerID="cri-o://a710b564acc4f167fb40735b405bac7a40c546f8508f0fafc908919423c0a3ac" gracePeriod=30 Nov 25 09:20:09 crc kubenswrapper[4565]: I1125 09:20:09.363127 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dab6b35a-23d8-4736-9cf9-87dbec1101d1" 
containerName="ceilometer-notification-agent" containerID="cri-o://70bf1177a68a3ed02b2b85b7f496bb303d25b49149aa3fdb0f7ca2bb65879d1d" gracePeriod=30 Nov 25 09:20:09 crc kubenswrapper[4565]: I1125 09:20:09.386975 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="dab6b35a-23d8-4736-9cf9-87dbec1101d1" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.150:3000/\": EOF" Nov 25 09:20:09 crc kubenswrapper[4565]: I1125 09:20:09.863378 4565 generic.go:334] "Generic (PLEG): container finished" podID="dab6b35a-23d8-4736-9cf9-87dbec1101d1" containerID="67a93da19e1a3aac3d51f5ae7df42f5ed58a8d04eb4e5f9b3775dc5ab2c19836" exitCode=0 Nov 25 09:20:09 crc kubenswrapper[4565]: I1125 09:20:09.863742 4565 generic.go:334] "Generic (PLEG): container finished" podID="dab6b35a-23d8-4736-9cf9-87dbec1101d1" containerID="a710b564acc4f167fb40735b405bac7a40c546f8508f0fafc908919423c0a3ac" exitCode=2 Nov 25 09:20:09 crc kubenswrapper[4565]: I1125 09:20:09.863754 4565 generic.go:334] "Generic (PLEG): container finished" podID="dab6b35a-23d8-4736-9cf9-87dbec1101d1" containerID="b0d1f9b50f5fc9dc981ea58a645b4449ceecf493f3fc9764d59d32f72982ef9a" exitCode=0 Nov 25 09:20:09 crc kubenswrapper[4565]: I1125 09:20:09.863472 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dab6b35a-23d8-4736-9cf9-87dbec1101d1","Type":"ContainerDied","Data":"67a93da19e1a3aac3d51f5ae7df42f5ed58a8d04eb4e5f9b3775dc5ab2c19836"} Nov 25 09:20:09 crc kubenswrapper[4565]: I1125 09:20:09.863814 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dab6b35a-23d8-4736-9cf9-87dbec1101d1","Type":"ContainerDied","Data":"a710b564acc4f167fb40735b405bac7a40c546f8508f0fafc908919423c0a3ac"} Nov 25 09:20:09 crc kubenswrapper[4565]: I1125 09:20:09.863834 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"dab6b35a-23d8-4736-9cf9-87dbec1101d1","Type":"ContainerDied","Data":"b0d1f9b50f5fc9dc981ea58a645b4449ceecf493f3fc9764d59d32f72982ef9a"} Nov 25 09:20:10 crc kubenswrapper[4565]: I1125 09:20:10.880793 4565 generic.go:334] "Generic (PLEG): container finished" podID="dab6b35a-23d8-4736-9cf9-87dbec1101d1" containerID="70bf1177a68a3ed02b2b85b7f496bb303d25b49149aa3fdb0f7ca2bb65879d1d" exitCode=0 Nov 25 09:20:10 crc kubenswrapper[4565]: I1125 09:20:10.880860 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dab6b35a-23d8-4736-9cf9-87dbec1101d1","Type":"ContainerDied","Data":"70bf1177a68a3ed02b2b85b7f496bb303d25b49149aa3fdb0f7ca2bb65879d1d"} Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.140578 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.256563 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab6b35a-23d8-4736-9cf9-87dbec1101d1-config-data\") pod \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.256677 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dab6b35a-23d8-4736-9cf9-87dbec1101d1-run-httpd\") pod \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.256766 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdwdf\" (UniqueName: \"kubernetes.io/projected/dab6b35a-23d8-4736-9cf9-87dbec1101d1-kube-api-access-tdwdf\") pod \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.256808 4565 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dab6b35a-23d8-4736-9cf9-87dbec1101d1-scripts\") pod \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.257543 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dab6b35a-23d8-4736-9cf9-87dbec1101d1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dab6b35a-23d8-4736-9cf9-87dbec1101d1" (UID: "dab6b35a-23d8-4736-9cf9-87dbec1101d1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.257818 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dab6b35a-23d8-4736-9cf9-87dbec1101d1-log-httpd\") pod \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.257943 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dab6b35a-23d8-4736-9cf9-87dbec1101d1-sg-core-conf-yaml\") pod \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.258005 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab6b35a-23d8-4736-9cf9-87dbec1101d1-combined-ca-bundle\") pod \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\" (UID: \"dab6b35a-23d8-4736-9cf9-87dbec1101d1\") " Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.258246 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dab6b35a-23d8-4736-9cf9-87dbec1101d1-log-httpd" (OuterVolumeSpecName: 
"log-httpd") pod "dab6b35a-23d8-4736-9cf9-87dbec1101d1" (UID: "dab6b35a-23d8-4736-9cf9-87dbec1101d1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.258739 4565 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dab6b35a-23d8-4736-9cf9-87dbec1101d1-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.258762 4565 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dab6b35a-23d8-4736-9cf9-87dbec1101d1-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.283055 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab6b35a-23d8-4736-9cf9-87dbec1101d1-scripts" (OuterVolumeSpecName: "scripts") pod "dab6b35a-23d8-4736-9cf9-87dbec1101d1" (UID: "dab6b35a-23d8-4736-9cf9-87dbec1101d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.283552 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab6b35a-23d8-4736-9cf9-87dbec1101d1-kube-api-access-tdwdf" (OuterVolumeSpecName: "kube-api-access-tdwdf") pod "dab6b35a-23d8-4736-9cf9-87dbec1101d1" (UID: "dab6b35a-23d8-4736-9cf9-87dbec1101d1"). InnerVolumeSpecName "kube-api-access-tdwdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.294403 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab6b35a-23d8-4736-9cf9-87dbec1101d1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dab6b35a-23d8-4736-9cf9-87dbec1101d1" (UID: "dab6b35a-23d8-4736-9cf9-87dbec1101d1"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.333240 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab6b35a-23d8-4736-9cf9-87dbec1101d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dab6b35a-23d8-4736-9cf9-87dbec1101d1" (UID: "dab6b35a-23d8-4736-9cf9-87dbec1101d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.362047 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdwdf\" (UniqueName: \"kubernetes.io/projected/dab6b35a-23d8-4736-9cf9-87dbec1101d1-kube-api-access-tdwdf\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.362086 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dab6b35a-23d8-4736-9cf9-87dbec1101d1-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.362190 4565 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dab6b35a-23d8-4736-9cf9-87dbec1101d1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.362208 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab6b35a-23d8-4736-9cf9-87dbec1101d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.365228 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab6b35a-23d8-4736-9cf9-87dbec1101d1-config-data" (OuterVolumeSpecName: "config-data") pod "dab6b35a-23d8-4736-9cf9-87dbec1101d1" (UID: "dab6b35a-23d8-4736-9cf9-87dbec1101d1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.391241 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.465464 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab6b35a-23d8-4736-9cf9-87dbec1101d1-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.896849 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dab6b35a-23d8-4736-9cf9-87dbec1101d1","Type":"ContainerDied","Data":"119990ee1ba13f833f7aa67315574b78a4b238a60294029d3e6e283fed6306b6"} Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.896946 4565 scope.go:117] "RemoveContainer" containerID="67a93da19e1a3aac3d51f5ae7df42f5ed58a8d04eb4e5f9b3775dc5ab2c19836" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.897624 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.938427 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.942644 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.962328 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:20:11 crc kubenswrapper[4565]: E1125 09:20:11.970461 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab6b35a-23d8-4736-9cf9-87dbec1101d1" containerName="sg-core" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.970484 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab6b35a-23d8-4736-9cf9-87dbec1101d1" containerName="sg-core" Nov 25 09:20:11 crc kubenswrapper[4565]: E1125 09:20:11.970496 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab6b35a-23d8-4736-9cf9-87dbec1101d1" containerName="ceilometer-notification-agent" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.970503 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab6b35a-23d8-4736-9cf9-87dbec1101d1" containerName="ceilometer-notification-agent" Nov 25 09:20:11 crc kubenswrapper[4565]: E1125 09:20:11.970525 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab6b35a-23d8-4736-9cf9-87dbec1101d1" containerName="proxy-httpd" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.970533 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab6b35a-23d8-4736-9cf9-87dbec1101d1" containerName="proxy-httpd" Nov 25 09:20:11 crc kubenswrapper[4565]: E1125 09:20:11.970571 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab6b35a-23d8-4736-9cf9-87dbec1101d1" containerName="ceilometer-central-agent" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.970580 4565 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="dab6b35a-23d8-4736-9cf9-87dbec1101d1" containerName="ceilometer-central-agent" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.970995 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab6b35a-23d8-4736-9cf9-87dbec1101d1" containerName="ceilometer-central-agent" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.971018 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab6b35a-23d8-4736-9cf9-87dbec1101d1" containerName="proxy-httpd" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.971030 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab6b35a-23d8-4736-9cf9-87dbec1101d1" containerName="ceilometer-notification-agent" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.971037 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab6b35a-23d8-4736-9cf9-87dbec1101d1" containerName="sg-core" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.980915 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.986341 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.986427 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 09:20:11 crc kubenswrapper[4565]: I1125 09:20:11.996610 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:20:12 crc kubenswrapper[4565]: I1125 09:20:12.082784 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " pod="openstack/ceilometer-0" Nov 25 09:20:12 crc kubenswrapper[4565]: I1125 09:20:12.082947 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-scripts\") pod \"ceilometer-0\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " pod="openstack/ceilometer-0" Nov 25 09:20:12 crc kubenswrapper[4565]: I1125 09:20:12.083028 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-log-httpd\") pod \"ceilometer-0\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " pod="openstack/ceilometer-0" Nov 25 09:20:12 crc kubenswrapper[4565]: I1125 09:20:12.083208 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " 
pod="openstack/ceilometer-0" Nov 25 09:20:12 crc kubenswrapper[4565]: I1125 09:20:12.083243 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8rn2\" (UniqueName: \"kubernetes.io/projected/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-kube-api-access-d8rn2\") pod \"ceilometer-0\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " pod="openstack/ceilometer-0" Nov 25 09:20:12 crc kubenswrapper[4565]: I1125 09:20:12.083648 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-config-data\") pod \"ceilometer-0\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " pod="openstack/ceilometer-0" Nov 25 09:20:12 crc kubenswrapper[4565]: I1125 09:20:12.083869 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-run-httpd\") pod \"ceilometer-0\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " pod="openstack/ceilometer-0" Nov 25 09:20:12 crc kubenswrapper[4565]: I1125 09:20:12.186225 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-config-data\") pod \"ceilometer-0\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " pod="openstack/ceilometer-0" Nov 25 09:20:12 crc kubenswrapper[4565]: I1125 09:20:12.186358 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-run-httpd\") pod \"ceilometer-0\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " pod="openstack/ceilometer-0" Nov 25 09:20:12 crc kubenswrapper[4565]: I1125 09:20:12.186409 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " pod="openstack/ceilometer-0" Nov 25 09:20:12 crc kubenswrapper[4565]: I1125 09:20:12.186439 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-scripts\") pod \"ceilometer-0\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " pod="openstack/ceilometer-0" Nov 25 09:20:12 crc kubenswrapper[4565]: I1125 09:20:12.186480 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-log-httpd\") pod \"ceilometer-0\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " pod="openstack/ceilometer-0" Nov 25 09:20:12 crc kubenswrapper[4565]: I1125 09:20:12.186531 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " pod="openstack/ceilometer-0" Nov 25 09:20:12 crc kubenswrapper[4565]: I1125 09:20:12.186671 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8rn2\" (UniqueName: \"kubernetes.io/projected/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-kube-api-access-d8rn2\") pod \"ceilometer-0\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " pod="openstack/ceilometer-0" Nov 25 09:20:12 crc kubenswrapper[4565]: I1125 09:20:12.187345 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-run-httpd\") pod \"ceilometer-0\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " pod="openstack/ceilometer-0" Nov 25 09:20:12 crc 
kubenswrapper[4565]: I1125 09:20:12.187589 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-log-httpd\") pod \"ceilometer-0\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " pod="openstack/ceilometer-0" Nov 25 09:20:12 crc kubenswrapper[4565]: I1125 09:20:12.191675 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-scripts\") pod \"ceilometer-0\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " pod="openstack/ceilometer-0" Nov 25 09:20:12 crc kubenswrapper[4565]: I1125 09:20:12.191786 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " pod="openstack/ceilometer-0" Nov 25 09:20:12 crc kubenswrapper[4565]: I1125 09:20:12.192112 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-config-data\") pod \"ceilometer-0\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " pod="openstack/ceilometer-0" Nov 25 09:20:12 crc kubenswrapper[4565]: I1125 09:20:12.193402 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " pod="openstack/ceilometer-0" Nov 25 09:20:12 crc kubenswrapper[4565]: I1125 09:20:12.202341 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8rn2\" (UniqueName: \"kubernetes.io/projected/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-kube-api-access-d8rn2\") pod \"ceilometer-0\" (UID: 
\"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " pod="openstack/ceilometer-0" Nov 25 09:20:12 crc kubenswrapper[4565]: I1125 09:20:12.311706 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:20:13 crc kubenswrapper[4565]: I1125 09:20:13.108986 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dab6b35a-23d8-4736-9cf9-87dbec1101d1" path="/var/lib/kubelet/pods/dab6b35a-23d8-4736-9cf9-87dbec1101d1/volumes" Nov 25 09:20:17 crc kubenswrapper[4565]: I1125 09:20:17.630201 4565 scope.go:117] "RemoveContainer" containerID="a710b564acc4f167fb40735b405bac7a40c546f8508f0fafc908919423c0a3ac" Nov 25 09:20:17 crc kubenswrapper[4565]: I1125 09:20:17.670078 4565 scope.go:117] "RemoveContainer" containerID="70bf1177a68a3ed02b2b85b7f496bb303d25b49149aa3fdb0f7ca2bb65879d1d" Nov 25 09:20:17 crc kubenswrapper[4565]: I1125 09:20:17.707885 4565 scope.go:117] "RemoveContainer" containerID="b0d1f9b50f5fc9dc981ea58a645b4449ceecf493f3fc9764d59d32f72982ef9a" Nov 25 09:20:18 crc kubenswrapper[4565]: I1125 09:20:18.035252 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:20:18 crc kubenswrapper[4565]: I1125 09:20:18.061542 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7dfd09a5-8627-4394-ac4f-367458ffe0b2","Type":"ContainerStarted","Data":"9fc3d0503ac8c4b701159559fec1036947fc52d573bacd5932b77da42742987b"} Nov 25 09:20:18 crc kubenswrapper[4565]: I1125 09:20:18.110732 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.160685522 podStartE2EDuration="13.110716235s" podCreationTimestamp="2025-11-25 09:20:05 +0000 UTC" firstStartedPulling="2025-11-25 09:20:05.761818861 +0000 UTC m=+938.964313989" lastFinishedPulling="2025-11-25 09:20:17.711849564 +0000 UTC m=+950.914344702" observedRunningTime="2025-11-25 09:20:18.104591088 +0000 UTC 
m=+951.307086227" watchObservedRunningTime="2025-11-25 09:20:18.110716235 +0000 UTC m=+951.313211374" Nov 25 09:20:18 crc kubenswrapper[4565]: I1125 09:20:18.643797 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:20:18 crc kubenswrapper[4565]: I1125 09:20:18.878955 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-6qlsb"] Nov 25 09:20:18 crc kubenswrapper[4565]: I1125 09:20:18.886623 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6qlsb" Nov 25 09:20:18 crc kubenswrapper[4565]: I1125 09:20:18.965350 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6qlsb"] Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.078463 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-2601-account-create-pwb9l"] Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.085647 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-2601-account-create-pwb9l" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.087070 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39af6b73-ead9-4001-b7bb-990b384efbd6-operator-scripts\") pod \"nova-api-db-create-6qlsb\" (UID: \"39af6b73-ead9-4001-b7bb-990b384efbd6\") " pod="openstack/nova-api-db-create-6qlsb" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.087230 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnqkm\" (UniqueName: \"kubernetes.io/projected/39af6b73-ead9-4001-b7bb-990b384efbd6-kube-api-access-pnqkm\") pod \"nova-api-db-create-6qlsb\" (UID: \"39af6b73-ead9-4001-b7bb-990b384efbd6\") " pod="openstack/nova-api-db-create-6qlsb" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.089408 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.095370 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa","Type":"ContainerStarted","Data":"d326b75bb39d841cbd1a72eade51295ceda61a7028f73dff3eb4cadd89615371"} Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.095412 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa","Type":"ContainerStarted","Data":"afe9f04f1d96844dba902831a9c918fb66518aeefdbcabd5157e2d88628fce46"} Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.115055 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2601-account-create-pwb9l"] Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.174101 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-8hwpf"] Nov 25 09:20:19 
crc kubenswrapper[4565]: I1125 09:20:19.176402 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-8hwpf" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.205107 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8d7j\" (UniqueName: \"kubernetes.io/projected/910d79e3-ae87-4083-ab49-d472d838cca5-kube-api-access-k8d7j\") pod \"nova-api-2601-account-create-pwb9l\" (UID: \"910d79e3-ae87-4083-ab49-d472d838cca5\") " pod="openstack/nova-api-2601-account-create-pwb9l" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.205263 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnqkm\" (UniqueName: \"kubernetes.io/projected/39af6b73-ead9-4001-b7bb-990b384efbd6-kube-api-access-pnqkm\") pod \"nova-api-db-create-6qlsb\" (UID: \"39af6b73-ead9-4001-b7bb-990b384efbd6\") " pod="openstack/nova-api-db-create-6qlsb" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.205415 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/910d79e3-ae87-4083-ab49-d472d838cca5-operator-scripts\") pod \"nova-api-2601-account-create-pwb9l\" (UID: \"910d79e3-ae87-4083-ab49-d472d838cca5\") " pod="openstack/nova-api-2601-account-create-pwb9l" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.205524 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39af6b73-ead9-4001-b7bb-990b384efbd6-operator-scripts\") pod \"nova-api-db-create-6qlsb\" (UID: \"39af6b73-ead9-4001-b7bb-990b384efbd6\") " pod="openstack/nova-api-db-create-6qlsb" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.208639 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/39af6b73-ead9-4001-b7bb-990b384efbd6-operator-scripts\") pod \"nova-api-db-create-6qlsb\" (UID: \"39af6b73-ead9-4001-b7bb-990b384efbd6\") " pod="openstack/nova-api-db-create-6qlsb" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.211498 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-8hwpf"] Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.228427 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnqkm\" (UniqueName: \"kubernetes.io/projected/39af6b73-ead9-4001-b7bb-990b384efbd6-kube-api-access-pnqkm\") pod \"nova-api-db-create-6qlsb\" (UID: \"39af6b73-ead9-4001-b7bb-990b384efbd6\") " pod="openstack/nova-api-db-create-6qlsb" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.285580 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-zbsrv"] Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.287628 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-zbsrv" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.296809 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-zbsrv"] Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.308063 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f7705c-306a-4c2f-b4c1-3b1617c83568-operator-scripts\") pod \"nova-cell0-db-create-8hwpf\" (UID: \"55f7705c-306a-4c2f-b4c1-3b1617c83568\") " pod="openstack/nova-cell0-db-create-8hwpf" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.308179 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thrmn\" (UniqueName: \"kubernetes.io/projected/28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645-kube-api-access-thrmn\") pod \"nova-cell1-db-create-zbsrv\" (UID: \"28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645\") " pod="openstack/nova-cell1-db-create-zbsrv" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.308339 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/910d79e3-ae87-4083-ab49-d472d838cca5-operator-scripts\") pod \"nova-api-2601-account-create-pwb9l\" (UID: \"910d79e3-ae87-4083-ab49-d472d838cca5\") " pod="openstack/nova-api-2601-account-create-pwb9l" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.308543 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8d7j\" (UniqueName: \"kubernetes.io/projected/910d79e3-ae87-4083-ab49-d472d838cca5-kube-api-access-k8d7j\") pod \"nova-api-2601-account-create-pwb9l\" (UID: \"910d79e3-ae87-4083-ab49-d472d838cca5\") " pod="openstack/nova-api-2601-account-create-pwb9l" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.308974 4565 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645-operator-scripts\") pod \"nova-cell1-db-create-zbsrv\" (UID: \"28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645\") " pod="openstack/nova-cell1-db-create-zbsrv" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.309061 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4s7d\" (UniqueName: \"kubernetes.io/projected/55f7705c-306a-4c2f-b4c1-3b1617c83568-kube-api-access-j4s7d\") pod \"nova-cell0-db-create-8hwpf\" (UID: \"55f7705c-306a-4c2f-b4c1-3b1617c83568\") " pod="openstack/nova-cell0-db-create-8hwpf" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.309519 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/910d79e3-ae87-4083-ab49-d472d838cca5-operator-scripts\") pod \"nova-api-2601-account-create-pwb9l\" (UID: \"910d79e3-ae87-4083-ab49-d472d838cca5\") " pod="openstack/nova-api-2601-account-create-pwb9l" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.330451 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-d911-account-create-sd45h"] Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.334683 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d911-account-create-sd45h" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.343494 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d911-account-create-sd45h"] Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.344248 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8d7j\" (UniqueName: \"kubernetes.io/projected/910d79e3-ae87-4083-ab49-d472d838cca5-kube-api-access-k8d7j\") pod \"nova-api-2601-account-create-pwb9l\" (UID: \"910d79e3-ae87-4083-ab49-d472d838cca5\") " pod="openstack/nova-api-2601-account-create-pwb9l" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.355576 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.410562 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f7705c-306a-4c2f-b4c1-3b1617c83568-operator-scripts\") pod \"nova-cell0-db-create-8hwpf\" (UID: \"55f7705c-306a-4c2f-b4c1-3b1617c83568\") " pod="openstack/nova-cell0-db-create-8hwpf" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.410600 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thrmn\" (UniqueName: \"kubernetes.io/projected/28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645-kube-api-access-thrmn\") pod \"nova-cell1-db-create-zbsrv\" (UID: \"28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645\") " pod="openstack/nova-cell1-db-create-zbsrv" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.410634 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c562a5bf-0aaf-4c6b-bef5-c3de522e3382-operator-scripts\") pod \"nova-cell0-d911-account-create-sd45h\" (UID: \"c562a5bf-0aaf-4c6b-bef5-c3de522e3382\") " 
pod="openstack/nova-cell0-d911-account-create-sd45h" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.410714 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5bv5\" (UniqueName: \"kubernetes.io/projected/c562a5bf-0aaf-4c6b-bef5-c3de522e3382-kube-api-access-b5bv5\") pod \"nova-cell0-d911-account-create-sd45h\" (UID: \"c562a5bf-0aaf-4c6b-bef5-c3de522e3382\") " pod="openstack/nova-cell0-d911-account-create-sd45h" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.410867 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645-operator-scripts\") pod \"nova-cell1-db-create-zbsrv\" (UID: \"28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645\") " pod="openstack/nova-cell1-db-create-zbsrv" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.410900 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4s7d\" (UniqueName: \"kubernetes.io/projected/55f7705c-306a-4c2f-b4c1-3b1617c83568-kube-api-access-j4s7d\") pod \"nova-cell0-db-create-8hwpf\" (UID: \"55f7705c-306a-4c2f-b4c1-3b1617c83568\") " pod="openstack/nova-cell0-db-create-8hwpf" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.411415 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f7705c-306a-4c2f-b4c1-3b1617c83568-operator-scripts\") pod \"nova-cell0-db-create-8hwpf\" (UID: \"55f7705c-306a-4c2f-b4c1-3b1617c83568\") " pod="openstack/nova-cell0-db-create-8hwpf" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.411894 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645-operator-scripts\") pod \"nova-cell1-db-create-zbsrv\" (UID: \"28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645\") " 
pod="openstack/nova-cell1-db-create-zbsrv" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.413458 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2601-account-create-pwb9l" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.427213 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thrmn\" (UniqueName: \"kubernetes.io/projected/28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645-kube-api-access-thrmn\") pod \"nova-cell1-db-create-zbsrv\" (UID: \"28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645\") " pod="openstack/nova-cell1-db-create-zbsrv" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.427732 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4s7d\" (UniqueName: \"kubernetes.io/projected/55f7705c-306a-4c2f-b4c1-3b1617c83568-kube-api-access-j4s7d\") pod \"nova-cell0-db-create-8hwpf\" (UID: \"55f7705c-306a-4c2f-b4c1-3b1617c83568\") " pod="openstack/nova-cell0-db-create-8hwpf" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.481898 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-4c0b-account-create-dsld7"] Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.483403 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4c0b-account-create-dsld7" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.487362 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.492218 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4c0b-account-create-dsld7"] Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.508374 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-8hwpf" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.514263 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5bv5\" (UniqueName: \"kubernetes.io/projected/c562a5bf-0aaf-4c6b-bef5-c3de522e3382-kube-api-access-b5bv5\") pod \"nova-cell0-d911-account-create-sd45h\" (UID: \"c562a5bf-0aaf-4c6b-bef5-c3de522e3382\") " pod="openstack/nova-cell0-d911-account-create-sd45h" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.514542 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c562a5bf-0aaf-4c6b-bef5-c3de522e3382-operator-scripts\") pod \"nova-cell0-d911-account-create-sd45h\" (UID: \"c562a5bf-0aaf-4c6b-bef5-c3de522e3382\") " pod="openstack/nova-cell0-d911-account-create-sd45h" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.515219 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c562a5bf-0aaf-4c6b-bef5-c3de522e3382-operator-scripts\") pod \"nova-cell0-d911-account-create-sd45h\" (UID: \"c562a5bf-0aaf-4c6b-bef5-c3de522e3382\") " pod="openstack/nova-cell0-d911-account-create-sd45h" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.527135 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-6qlsb" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.534301 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5bv5\" (UniqueName: \"kubernetes.io/projected/c562a5bf-0aaf-4c6b-bef5-c3de522e3382-kube-api-access-b5bv5\") pod \"nova-cell0-d911-account-create-sd45h\" (UID: \"c562a5bf-0aaf-4c6b-bef5-c3de522e3382\") " pod="openstack/nova-cell0-d911-account-create-sd45h" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.609158 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zbsrv" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.620866 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f03e89dd-5750-496e-8fed-117f36a6649b-operator-scripts\") pod \"nova-cell1-4c0b-account-create-dsld7\" (UID: \"f03e89dd-5750-496e-8fed-117f36a6649b\") " pod="openstack/nova-cell1-4c0b-account-create-dsld7" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.621172 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn84c\" (UniqueName: \"kubernetes.io/projected/f03e89dd-5750-496e-8fed-117f36a6649b-kube-api-access-qn84c\") pod \"nova-cell1-4c0b-account-create-dsld7\" (UID: \"f03e89dd-5750-496e-8fed-117f36a6649b\") " pod="openstack/nova-cell1-4c0b-account-create-dsld7" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.722539 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn84c\" (UniqueName: \"kubernetes.io/projected/f03e89dd-5750-496e-8fed-117f36a6649b-kube-api-access-qn84c\") pod \"nova-cell1-4c0b-account-create-dsld7\" (UID: \"f03e89dd-5750-496e-8fed-117f36a6649b\") " pod="openstack/nova-cell1-4c0b-account-create-dsld7" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 
09:20:19.722663 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f03e89dd-5750-496e-8fed-117f36a6649b-operator-scripts\") pod \"nova-cell1-4c0b-account-create-dsld7\" (UID: \"f03e89dd-5750-496e-8fed-117f36a6649b\") " pod="openstack/nova-cell1-4c0b-account-create-dsld7" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.725322 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f03e89dd-5750-496e-8fed-117f36a6649b-operator-scripts\") pod \"nova-cell1-4c0b-account-create-dsld7\" (UID: \"f03e89dd-5750-496e-8fed-117f36a6649b\") " pod="openstack/nova-cell1-4c0b-account-create-dsld7" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.759308 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn84c\" (UniqueName: \"kubernetes.io/projected/f03e89dd-5750-496e-8fed-117f36a6649b-kube-api-access-qn84c\") pod \"nova-cell1-4c0b-account-create-dsld7\" (UID: \"f03e89dd-5750-496e-8fed-117f36a6649b\") " pod="openstack/nova-cell1-4c0b-account-create-dsld7" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.765392 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d911-account-create-sd45h" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.825369 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4c0b-account-create-dsld7" Nov 25 09:20:19 crc kubenswrapper[4565]: I1125 09:20:19.918353 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2601-account-create-pwb9l"] Nov 25 09:20:20 crc kubenswrapper[4565]: I1125 09:20:20.054421 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-8hwpf"] Nov 25 09:20:20 crc kubenswrapper[4565]: I1125 09:20:20.123881 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2601-account-create-pwb9l" event={"ID":"910d79e3-ae87-4083-ab49-d472d838cca5","Type":"ContainerStarted","Data":"f07fe17ab950298563ab8a50e37ceb3bc3c44ab20ba8c82ae7043ca5a90b9f3d"} Nov 25 09:20:20 crc kubenswrapper[4565]: I1125 09:20:20.132845 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-8hwpf" event={"ID":"55f7705c-306a-4c2f-b4c1-3b1617c83568","Type":"ContainerStarted","Data":"bca7f935c88a6a7a56e62c7d6c135bcb5559fc6694d678b3d36734056ebc61fc"} Nov 25 09:20:20 crc kubenswrapper[4565]: I1125 09:20:20.145473 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa","Type":"ContainerStarted","Data":"facdb711870996a1763cf3843d9c3e85fc6e7ae9006397324a7e10343027243d"} Nov 25 09:20:20 crc kubenswrapper[4565]: I1125 09:20:20.223317 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6qlsb"] Nov 25 09:20:20 crc kubenswrapper[4565]: I1125 09:20:20.302802 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-zbsrv"] Nov 25 09:20:20 crc kubenswrapper[4565]: W1125 09:20:20.314988 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28aa4f32_5ea5_4e25_aa7d_23b5d4ea2645.slice/crio-8d8b386b3d144f827f2e4aea091f1338bdf52d1c4d677bdf73505a6e81119195 WatchSource:0}: Error 
finding container 8d8b386b3d144f827f2e4aea091f1338bdf52d1c4d677bdf73505a6e81119195: Status 404 returned error can't find the container with id 8d8b386b3d144f827f2e4aea091f1338bdf52d1c4d677bdf73505a6e81119195 Nov 25 09:20:20 crc kubenswrapper[4565]: I1125 09:20:20.383564 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d911-account-create-sd45h"] Nov 25 09:20:20 crc kubenswrapper[4565]: W1125 09:20:20.398453 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc562a5bf_0aaf_4c6b_bef5_c3de522e3382.slice/crio-0b651ba99b35b7fb02437b4a26cc0a048a19357f762936a7e88798cf56b6cc9b WatchSource:0}: Error finding container 0b651ba99b35b7fb02437b4a26cc0a048a19357f762936a7e88798cf56b6cc9b: Status 404 returned error can't find the container with id 0b651ba99b35b7fb02437b4a26cc0a048a19357f762936a7e88798cf56b6cc9b Nov 25 09:20:20 crc kubenswrapper[4565]: I1125 09:20:20.515585 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4c0b-account-create-dsld7"] Nov 25 09:20:20 crc kubenswrapper[4565]: W1125 09:20:20.562108 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf03e89dd_5750_496e_8fed_117f36a6649b.slice/crio-6e9e4f971b8470c15c38965f00e1767f8a42596d98b98676ae8d717ec1ea763f WatchSource:0}: Error finding container 6e9e4f971b8470c15c38965f00e1767f8a42596d98b98676ae8d717ec1ea763f: Status 404 returned error can't find the container with id 6e9e4f971b8470c15c38965f00e1767f8a42596d98b98676ae8d717ec1ea763f Nov 25 09:20:21 crc kubenswrapper[4565]: I1125 09:20:21.154842 4565 generic.go:334] "Generic (PLEG): container finished" podID="f03e89dd-5750-496e-8fed-117f36a6649b" containerID="f73fd81c11bc0a963f9638866fd24ace6521bf7bd2e5d35ec6c05f11252ffdc9" exitCode=0 Nov 25 09:20:21 crc kubenswrapper[4565]: I1125 09:20:21.155000 4565 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-cell1-4c0b-account-create-dsld7" event={"ID":"f03e89dd-5750-496e-8fed-117f36a6649b","Type":"ContainerDied","Data":"f73fd81c11bc0a963f9638866fd24ace6521bf7bd2e5d35ec6c05f11252ffdc9"} Nov 25 09:20:21 crc kubenswrapper[4565]: I1125 09:20:21.156539 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4c0b-account-create-dsld7" event={"ID":"f03e89dd-5750-496e-8fed-117f36a6649b","Type":"ContainerStarted","Data":"6e9e4f971b8470c15c38965f00e1767f8a42596d98b98676ae8d717ec1ea763f"} Nov 25 09:20:21 crc kubenswrapper[4565]: I1125 09:20:21.159904 4565 generic.go:334] "Generic (PLEG): container finished" podID="28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645" containerID="20a67b7c8db921e0c289e276c1d116835d692e5c5a8ad314169969c24a268e42" exitCode=0 Nov 25 09:20:21 crc kubenswrapper[4565]: I1125 09:20:21.160057 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zbsrv" event={"ID":"28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645","Type":"ContainerDied","Data":"20a67b7c8db921e0c289e276c1d116835d692e5c5a8ad314169969c24a268e42"} Nov 25 09:20:21 crc kubenswrapper[4565]: I1125 09:20:21.160135 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zbsrv" event={"ID":"28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645","Type":"ContainerStarted","Data":"8d8b386b3d144f827f2e4aea091f1338bdf52d1c4d677bdf73505a6e81119195"} Nov 25 09:20:21 crc kubenswrapper[4565]: I1125 09:20:21.164592 4565 generic.go:334] "Generic (PLEG): container finished" podID="c562a5bf-0aaf-4c6b-bef5-c3de522e3382" containerID="64309c9963538d1e0075da47637e79925e8504cd9622ca3e4ef4ad0cfa657368" exitCode=0 Nov 25 09:20:21 crc kubenswrapper[4565]: I1125 09:20:21.164714 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d911-account-create-sd45h" event={"ID":"c562a5bf-0aaf-4c6b-bef5-c3de522e3382","Type":"ContainerDied","Data":"64309c9963538d1e0075da47637e79925e8504cd9622ca3e4ef4ad0cfa657368"} Nov 25 
09:20:21 crc kubenswrapper[4565]: I1125 09:20:21.164777 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d911-account-create-sd45h" event={"ID":"c562a5bf-0aaf-4c6b-bef5-c3de522e3382","Type":"ContainerStarted","Data":"0b651ba99b35b7fb02437b4a26cc0a048a19357f762936a7e88798cf56b6cc9b"} Nov 25 09:20:21 crc kubenswrapper[4565]: I1125 09:20:21.167590 4565 generic.go:334] "Generic (PLEG): container finished" podID="55f7705c-306a-4c2f-b4c1-3b1617c83568" containerID="4c76c7ef4a0312ed1b7c32f9b5033d648eb62141f925e01fbc2b28f9a9473c3c" exitCode=0 Nov 25 09:20:21 crc kubenswrapper[4565]: I1125 09:20:21.167711 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-8hwpf" event={"ID":"55f7705c-306a-4c2f-b4c1-3b1617c83568","Type":"ContainerDied","Data":"4c76c7ef4a0312ed1b7c32f9b5033d648eb62141f925e01fbc2b28f9a9473c3c"} Nov 25 09:20:21 crc kubenswrapper[4565]: I1125 09:20:21.170542 4565 generic.go:334] "Generic (PLEG): container finished" podID="39af6b73-ead9-4001-b7bb-990b384efbd6" containerID="17b647ac503a84a0e4a2f50f2bae9096d71806f0f90aa8b93a931bdc0994d1a6" exitCode=0 Nov 25 09:20:21 crc kubenswrapper[4565]: I1125 09:20:21.170614 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6qlsb" event={"ID":"39af6b73-ead9-4001-b7bb-990b384efbd6","Type":"ContainerDied","Data":"17b647ac503a84a0e4a2f50f2bae9096d71806f0f90aa8b93a931bdc0994d1a6"} Nov 25 09:20:21 crc kubenswrapper[4565]: I1125 09:20:21.170636 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6qlsb" event={"ID":"39af6b73-ead9-4001-b7bb-990b384efbd6","Type":"ContainerStarted","Data":"38c72c049c17cdfd29e443919e7512d1a678c8d43caca59dbfabbd3ff2f1a9a4"} Nov 25 09:20:21 crc kubenswrapper[4565]: I1125 09:20:21.173828 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa","Type":"ContainerStarted","Data":"915ebe9fbf848dc071f53fcc1990c1aa7c86627c20f80970bd3771a2828ef470"} Nov 25 09:20:21 crc kubenswrapper[4565]: I1125 09:20:21.176124 4565 generic.go:334] "Generic (PLEG): container finished" podID="910d79e3-ae87-4083-ab49-d472d838cca5" containerID="64e136ed8f8c52e376ca5a0711f49bdc3e9d6f0c5911ed93b77c87ec1060650d" exitCode=0 Nov 25 09:20:21 crc kubenswrapper[4565]: I1125 09:20:21.176155 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2601-account-create-pwb9l" event={"ID":"910d79e3-ae87-4083-ab49-d472d838cca5","Type":"ContainerDied","Data":"64e136ed8f8c52e376ca5a0711f49bdc3e9d6f0c5911ed93b77c87ec1060650d"} Nov 25 09:20:22 crc kubenswrapper[4565]: I1125 09:20:22.553733 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d911-account-create-sd45h" Nov 25 09:20:22 crc kubenswrapper[4565]: I1125 09:20:22.595261 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5bv5\" (UniqueName: \"kubernetes.io/projected/c562a5bf-0aaf-4c6b-bef5-c3de522e3382-kube-api-access-b5bv5\") pod \"c562a5bf-0aaf-4c6b-bef5-c3de522e3382\" (UID: \"c562a5bf-0aaf-4c6b-bef5-c3de522e3382\") " Nov 25 09:20:22 crc kubenswrapper[4565]: I1125 09:20:22.596852 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c562a5bf-0aaf-4c6b-bef5-c3de522e3382-operator-scripts\") pod \"c562a5bf-0aaf-4c6b-bef5-c3de522e3382\" (UID: \"c562a5bf-0aaf-4c6b-bef5-c3de522e3382\") " Nov 25 09:20:22 crc kubenswrapper[4565]: I1125 09:20:22.598073 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c562a5bf-0aaf-4c6b-bef5-c3de522e3382-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c562a5bf-0aaf-4c6b-bef5-c3de522e3382" (UID: 
"c562a5bf-0aaf-4c6b-bef5-c3de522e3382"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:20:22 crc kubenswrapper[4565]: I1125 09:20:22.620831 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c562a5bf-0aaf-4c6b-bef5-c3de522e3382-kube-api-access-b5bv5" (OuterVolumeSpecName: "kube-api-access-b5bv5") pod "c562a5bf-0aaf-4c6b-bef5-c3de522e3382" (UID: "c562a5bf-0aaf-4c6b-bef5-c3de522e3382"). InnerVolumeSpecName "kube-api-access-b5bv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:20:22 crc kubenswrapper[4565]: I1125 09:20:22.700717 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5bv5\" (UniqueName: \"kubernetes.io/projected/c562a5bf-0aaf-4c6b-bef5-c3de522e3382-kube-api-access-b5bv5\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:22 crc kubenswrapper[4565]: I1125 09:20:22.700758 4565 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c562a5bf-0aaf-4c6b-bef5-c3de522e3382-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:22 crc kubenswrapper[4565]: I1125 09:20:22.812724 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-8hwpf" Nov 25 09:20:22 crc kubenswrapper[4565]: I1125 09:20:22.814878 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6qlsb" Nov 25 09:20:22 crc kubenswrapper[4565]: I1125 09:20:22.819869 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2601-account-create-pwb9l" Nov 25 09:20:22 crc kubenswrapper[4565]: I1125 09:20:22.826032 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-zbsrv" Nov 25 09:20:22 crc kubenswrapper[4565]: I1125 09:20:22.828101 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4c0b-account-create-dsld7" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.011568 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4s7d\" (UniqueName: \"kubernetes.io/projected/55f7705c-306a-4c2f-b4c1-3b1617c83568-kube-api-access-j4s7d\") pod \"55f7705c-306a-4c2f-b4c1-3b1617c83568\" (UID: \"55f7705c-306a-4c2f-b4c1-3b1617c83568\") " Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.011649 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39af6b73-ead9-4001-b7bb-990b384efbd6-operator-scripts\") pod \"39af6b73-ead9-4001-b7bb-990b384efbd6\" (UID: \"39af6b73-ead9-4001-b7bb-990b384efbd6\") " Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.011722 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn84c\" (UniqueName: \"kubernetes.io/projected/f03e89dd-5750-496e-8fed-117f36a6649b-kube-api-access-qn84c\") pod \"f03e89dd-5750-496e-8fed-117f36a6649b\" (UID: \"f03e89dd-5750-496e-8fed-117f36a6649b\") " Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.012490 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f03e89dd-5750-496e-8fed-117f36a6649b-operator-scripts\") pod \"f03e89dd-5750-496e-8fed-117f36a6649b\" (UID: \"f03e89dd-5750-496e-8fed-117f36a6649b\") " Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.012481 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39af6b73-ead9-4001-b7bb-990b384efbd6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"39af6b73-ead9-4001-b7bb-990b384efbd6" (UID: "39af6b73-ead9-4001-b7bb-990b384efbd6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.012531 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thrmn\" (UniqueName: \"kubernetes.io/projected/28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645-kube-api-access-thrmn\") pod \"28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645\" (UID: \"28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645\") " Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.012617 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/910d79e3-ae87-4083-ab49-d472d838cca5-operator-scripts\") pod \"910d79e3-ae87-4083-ab49-d472d838cca5\" (UID: \"910d79e3-ae87-4083-ab49-d472d838cca5\") " Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.012675 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f7705c-306a-4c2f-b4c1-3b1617c83568-operator-scripts\") pod \"55f7705c-306a-4c2f-b4c1-3b1617c83568\" (UID: \"55f7705c-306a-4c2f-b4c1-3b1617c83568\") " Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.012712 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8d7j\" (UniqueName: \"kubernetes.io/projected/910d79e3-ae87-4083-ab49-d472d838cca5-kube-api-access-k8d7j\") pod \"910d79e3-ae87-4083-ab49-d472d838cca5\" (UID: \"910d79e3-ae87-4083-ab49-d472d838cca5\") " Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.012737 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645-operator-scripts\") pod \"28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645\" (UID: \"28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645\") " Nov 25 
09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.012812 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f03e89dd-5750-496e-8fed-117f36a6649b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f03e89dd-5750-496e-8fed-117f36a6649b" (UID: "f03e89dd-5750-496e-8fed-117f36a6649b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.012836 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnqkm\" (UniqueName: \"kubernetes.io/projected/39af6b73-ead9-4001-b7bb-990b384efbd6-kube-api-access-pnqkm\") pod \"39af6b73-ead9-4001-b7bb-990b384efbd6\" (UID: \"39af6b73-ead9-4001-b7bb-990b384efbd6\") " Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.013126 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55f7705c-306a-4c2f-b4c1-3b1617c83568-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "55f7705c-306a-4c2f-b4c1-3b1617c83568" (UID: "55f7705c-306a-4c2f-b4c1-3b1617c83568"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.014003 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645" (UID: "28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.014051 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/910d79e3-ae87-4083-ab49-d472d838cca5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "910d79e3-ae87-4083-ab49-d472d838cca5" (UID: "910d79e3-ae87-4083-ab49-d472d838cca5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.014358 4565 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39af6b73-ead9-4001-b7bb-990b384efbd6-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.014387 4565 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f03e89dd-5750-496e-8fed-117f36a6649b-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.014400 4565 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/910d79e3-ae87-4083-ab49-d472d838cca5-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.014413 4565 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f7705c-306a-4c2f-b4c1-3b1617c83568-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.014423 4565 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.015801 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/55f7705c-306a-4c2f-b4c1-3b1617c83568-kube-api-access-j4s7d" (OuterVolumeSpecName: "kube-api-access-j4s7d") pod "55f7705c-306a-4c2f-b4c1-3b1617c83568" (UID: "55f7705c-306a-4c2f-b4c1-3b1617c83568"). InnerVolumeSpecName "kube-api-access-j4s7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.016557 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645-kube-api-access-thrmn" (OuterVolumeSpecName: "kube-api-access-thrmn") pod "28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645" (UID: "28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645"). InnerVolumeSpecName "kube-api-access-thrmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.017346 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f03e89dd-5750-496e-8fed-117f36a6649b-kube-api-access-qn84c" (OuterVolumeSpecName: "kube-api-access-qn84c") pod "f03e89dd-5750-496e-8fed-117f36a6649b" (UID: "f03e89dd-5750-496e-8fed-117f36a6649b"). InnerVolumeSpecName "kube-api-access-qn84c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.018570 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39af6b73-ead9-4001-b7bb-990b384efbd6-kube-api-access-pnqkm" (OuterVolumeSpecName: "kube-api-access-pnqkm") pod "39af6b73-ead9-4001-b7bb-990b384efbd6" (UID: "39af6b73-ead9-4001-b7bb-990b384efbd6"). InnerVolumeSpecName "kube-api-access-pnqkm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.019137 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/910d79e3-ae87-4083-ab49-d472d838cca5-kube-api-access-k8d7j" (OuterVolumeSpecName: "kube-api-access-k8d7j") pod "910d79e3-ae87-4083-ab49-d472d838cca5" (UID: "910d79e3-ae87-4083-ab49-d472d838cca5"). InnerVolumeSpecName "kube-api-access-k8d7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.117525 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thrmn\" (UniqueName: \"kubernetes.io/projected/28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645-kube-api-access-thrmn\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.117676 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8d7j\" (UniqueName: \"kubernetes.io/projected/910d79e3-ae87-4083-ab49-d472d838cca5-kube-api-access-k8d7j\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.117758 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnqkm\" (UniqueName: \"kubernetes.io/projected/39af6b73-ead9-4001-b7bb-990b384efbd6-kube-api-access-pnqkm\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.117833 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4s7d\" (UniqueName: \"kubernetes.io/projected/55f7705c-306a-4c2f-b4c1-3b1617c83568-kube-api-access-j4s7d\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.117888 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qn84c\" (UniqueName: \"kubernetes.io/projected/f03e89dd-5750-496e-8fed-117f36a6649b-kube-api-access-qn84c\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.198259 4565 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2601-account-create-pwb9l" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.198306 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2601-account-create-pwb9l" event={"ID":"910d79e3-ae87-4083-ab49-d472d838cca5","Type":"ContainerDied","Data":"f07fe17ab950298563ab8a50e37ceb3bc3c44ab20ba8c82ae7043ca5a90b9f3d"} Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.198372 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f07fe17ab950298563ab8a50e37ceb3bc3c44ab20ba8c82ae7043ca5a90b9f3d" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.201577 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4c0b-account-create-dsld7" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.201738 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4c0b-account-create-dsld7" event={"ID":"f03e89dd-5750-496e-8fed-117f36a6649b","Type":"ContainerDied","Data":"6e9e4f971b8470c15c38965f00e1767f8a42596d98b98676ae8d717ec1ea763f"} Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.201869 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e9e4f971b8470c15c38965f00e1767f8a42596d98b98676ae8d717ec1ea763f" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.203882 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zbsrv" event={"ID":"28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645","Type":"ContainerDied","Data":"8d8b386b3d144f827f2e4aea091f1338bdf52d1c4d677bdf73505a6e81119195"} Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.204000 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d8b386b3d144f827f2e4aea091f1338bdf52d1c4d677bdf73505a6e81119195" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.204192 4565 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zbsrv" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.207779 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d911-account-create-sd45h" event={"ID":"c562a5bf-0aaf-4c6b-bef5-c3de522e3382","Type":"ContainerDied","Data":"0b651ba99b35b7fb02437b4a26cc0a048a19357f762936a7e88798cf56b6cc9b"} Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.207832 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b651ba99b35b7fb02437b4a26cc0a048a19357f762936a7e88798cf56b6cc9b" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.207797 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d911-account-create-sd45h" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.210730 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-8hwpf" event={"ID":"55f7705c-306a-4c2f-b4c1-3b1617c83568","Type":"ContainerDied","Data":"bca7f935c88a6a7a56e62c7d6c135bcb5559fc6694d678b3d36734056ebc61fc"} Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.210768 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bca7f935c88a6a7a56e62c7d6c135bcb5559fc6694d678b3d36734056ebc61fc" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.211254 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-8hwpf" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.213424 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6qlsb" event={"ID":"39af6b73-ead9-4001-b7bb-990b384efbd6","Type":"ContainerDied","Data":"38c72c049c17cdfd29e443919e7512d1a678c8d43caca59dbfabbd3ff2f1a9a4"} Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.213520 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38c72c049c17cdfd29e443919e7512d1a678c8d43caca59dbfabbd3ff2f1a9a4" Nov 25 09:20:23 crc kubenswrapper[4565]: I1125 09:20:23.213642 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6qlsb" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.608133 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-l7jv6"] Nov 25 09:20:24 crc kubenswrapper[4565]: E1125 09:20:24.609177 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645" containerName="mariadb-database-create" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.609202 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645" containerName="mariadb-database-create" Nov 25 09:20:24 crc kubenswrapper[4565]: E1125 09:20:24.609223 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c562a5bf-0aaf-4c6b-bef5-c3de522e3382" containerName="mariadb-account-create" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.609230 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c562a5bf-0aaf-4c6b-bef5-c3de522e3382" containerName="mariadb-account-create" Nov 25 09:20:24 crc kubenswrapper[4565]: E1125 09:20:24.609245 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55f7705c-306a-4c2f-b4c1-3b1617c83568" containerName="mariadb-database-create" Nov 25 09:20:24 
crc kubenswrapper[4565]: I1125 09:20:24.609252 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="55f7705c-306a-4c2f-b4c1-3b1617c83568" containerName="mariadb-database-create" Nov 25 09:20:24 crc kubenswrapper[4565]: E1125 09:20:24.609263 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39af6b73-ead9-4001-b7bb-990b384efbd6" containerName="mariadb-database-create" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.609269 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="39af6b73-ead9-4001-b7bb-990b384efbd6" containerName="mariadb-database-create" Nov 25 09:20:24 crc kubenswrapper[4565]: E1125 09:20:24.609282 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f03e89dd-5750-496e-8fed-117f36a6649b" containerName="mariadb-account-create" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.609287 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f03e89dd-5750-496e-8fed-117f36a6649b" containerName="mariadb-account-create" Nov 25 09:20:24 crc kubenswrapper[4565]: E1125 09:20:24.609313 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="910d79e3-ae87-4083-ab49-d472d838cca5" containerName="mariadb-account-create" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.609320 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="910d79e3-ae87-4083-ab49-d472d838cca5" containerName="mariadb-account-create" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.609553 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="910d79e3-ae87-4083-ab49-d472d838cca5" containerName="mariadb-account-create" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.609566 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="55f7705c-306a-4c2f-b4c1-3b1617c83568" containerName="mariadb-database-create" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.609582 4565 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f03e89dd-5750-496e-8fed-117f36a6649b" containerName="mariadb-account-create" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.609596 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645" containerName="mariadb-database-create" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.609605 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="c562a5bf-0aaf-4c6b-bef5-c3de522e3382" containerName="mariadb-account-create" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.609618 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="39af6b73-ead9-4001-b7bb-990b384efbd6" containerName="mariadb-database-create" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.610489 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-l7jv6" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.627251 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-tjxjs" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.627510 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.628203 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.633958 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-l7jv6"] Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.754818 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84c75e09-f927-412d-9a45-b122c13e711d-scripts\") pod \"nova-cell0-conductor-db-sync-l7jv6\" (UID: \"84c75e09-f927-412d-9a45-b122c13e711d\") " pod="openstack/nova-cell0-conductor-db-sync-l7jv6" 
Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.754878 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84c75e09-f927-412d-9a45-b122c13e711d-config-data\") pod \"nova-cell0-conductor-db-sync-l7jv6\" (UID: \"84c75e09-f927-412d-9a45-b122c13e711d\") " pod="openstack/nova-cell0-conductor-db-sync-l7jv6" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.755530 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64gwm\" (UniqueName: \"kubernetes.io/projected/84c75e09-f927-412d-9a45-b122c13e711d-kube-api-access-64gwm\") pod \"nova-cell0-conductor-db-sync-l7jv6\" (UID: \"84c75e09-f927-412d-9a45-b122c13e711d\") " pod="openstack/nova-cell0-conductor-db-sync-l7jv6" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.755656 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84c75e09-f927-412d-9a45-b122c13e711d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-l7jv6\" (UID: \"84c75e09-f927-412d-9a45-b122c13e711d\") " pod="openstack/nova-cell0-conductor-db-sync-l7jv6" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.858207 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84c75e09-f927-412d-9a45-b122c13e711d-config-data\") pod \"nova-cell0-conductor-db-sync-l7jv6\" (UID: \"84c75e09-f927-412d-9a45-b122c13e711d\") " pod="openstack/nova-cell0-conductor-db-sync-l7jv6" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.858289 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64gwm\" (UniqueName: \"kubernetes.io/projected/84c75e09-f927-412d-9a45-b122c13e711d-kube-api-access-64gwm\") pod \"nova-cell0-conductor-db-sync-l7jv6\" (UID: 
\"84c75e09-f927-412d-9a45-b122c13e711d\") " pod="openstack/nova-cell0-conductor-db-sync-l7jv6" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.858345 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84c75e09-f927-412d-9a45-b122c13e711d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-l7jv6\" (UID: \"84c75e09-f927-412d-9a45-b122c13e711d\") " pod="openstack/nova-cell0-conductor-db-sync-l7jv6" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.858479 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84c75e09-f927-412d-9a45-b122c13e711d-scripts\") pod \"nova-cell0-conductor-db-sync-l7jv6\" (UID: \"84c75e09-f927-412d-9a45-b122c13e711d\") " pod="openstack/nova-cell0-conductor-db-sync-l7jv6" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.867744 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84c75e09-f927-412d-9a45-b122c13e711d-scripts\") pod \"nova-cell0-conductor-db-sync-l7jv6\" (UID: \"84c75e09-f927-412d-9a45-b122c13e711d\") " pod="openstack/nova-cell0-conductor-db-sync-l7jv6" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.868727 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84c75e09-f927-412d-9a45-b122c13e711d-config-data\") pod \"nova-cell0-conductor-db-sync-l7jv6\" (UID: \"84c75e09-f927-412d-9a45-b122c13e711d\") " pod="openstack/nova-cell0-conductor-db-sync-l7jv6" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.875058 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84c75e09-f927-412d-9a45-b122c13e711d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-l7jv6\" (UID: \"84c75e09-f927-412d-9a45-b122c13e711d\") " 
pod="openstack/nova-cell0-conductor-db-sync-l7jv6" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.879440 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64gwm\" (UniqueName: \"kubernetes.io/projected/84c75e09-f927-412d-9a45-b122c13e711d-kube-api-access-64gwm\") pod \"nova-cell0-conductor-db-sync-l7jv6\" (UID: \"84c75e09-f927-412d-9a45-b122c13e711d\") " pod="openstack/nova-cell0-conductor-db-sync-l7jv6" Nov 25 09:20:24 crc kubenswrapper[4565]: I1125 09:20:24.935969 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-l7jv6" Nov 25 09:20:25 crc kubenswrapper[4565]: I1125 09:20:25.099458 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:20:25 crc kubenswrapper[4565]: I1125 09:20:25.099891 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:20:25 crc kubenswrapper[4565]: I1125 09:20:25.121355 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 09:20:25 crc kubenswrapper[4565]: I1125 09:20:25.122499 4565 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f8cbf4d6873b3025c789286654bea15427f510e52a9c9dafb2d1c58270be257d"} pod="openshift-machine-config-operator/machine-config-daemon-r28bt" containerMessage="Container machine-config-daemon 
failed liveness probe, will be restarted" Nov 25 09:20:25 crc kubenswrapper[4565]: I1125 09:20:25.122566 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" containerID="cri-o://f8cbf4d6873b3025c789286654bea15427f510e52a9c9dafb2d1c58270be257d" gracePeriod=600 Nov 25 09:20:25 crc kubenswrapper[4565]: I1125 09:20:25.425013 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-l7jv6"] Nov 25 09:20:26 crc kubenswrapper[4565]: I1125 09:20:26.293894 4565 generic.go:334] "Generic (PLEG): container finished" podID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerID="f8cbf4d6873b3025c789286654bea15427f510e52a9c9dafb2d1c58270be257d" exitCode=0 Nov 25 09:20:26 crc kubenswrapper[4565]: I1125 09:20:26.294353 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerDied","Data":"f8cbf4d6873b3025c789286654bea15427f510e52a9c9dafb2d1c58270be257d"} Nov 25 09:20:26 crc kubenswrapper[4565]: I1125 09:20:26.294405 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerStarted","Data":"039cbb161cce07adb641b50f2f8a642843a69f44ac084886a6cded22a964aec2"} Nov 25 09:20:26 crc kubenswrapper[4565]: I1125 09:20:26.294426 4565 scope.go:117] "RemoveContainer" containerID="c10fd5b53bc647595e50d0c679601ea47018805ef1aa79f6ca728fb0a4552a71" Nov 25 09:20:26 crc kubenswrapper[4565]: I1125 09:20:26.296013 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-l7jv6" 
event={"ID":"84c75e09-f927-412d-9a45-b122c13e711d","Type":"ContainerStarted","Data":"329f273d3039ca20ba00377e3f29b05f02d671f79179319d58ff6ae3437b5b72"} Nov 25 09:20:30 crc kubenswrapper[4565]: E1125 09:20:30.994467 4565 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: parsing image configuration: Get \"https://cdn01.quay.io/quayio-production-s3/sha256/bd/bde1061b70edbc1bc91b49845034313e3c0902504bdea70ec607eba1102601fc?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251125%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251125T092020Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=289414932c66a72dddca90cc2844f13e5adfb14772f6067975a42ed31568a74a&region=us-east-1&namespace=redhat-prod&username=redhat-prod+registry_proxy&repo_name=ubi9----httpd-24&akamai_signature=exp=1764063320~hmac=f09e452a2d4b2e0abd3de843400a415991f2bc0de01e2d0755905e0633c318b8\": net/http: TLS handshake timeout" image="registry.redhat.io/ubi9/httpd-24@sha256:8536169e5537fe6c330eba814248abdcf39cdd8f7e7336034d74e6fda9544050" Nov 25 09:20:30 crc kubenswrapper[4565]: E1125 09:20:30.995480 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24@sha256:8536169e5537fe6c330eba814248abdcf39cdd8f7e7336034d74e6fda9544050,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d8rn2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa): ErrImagePull: copying system image from manifest list: parsing image configuration: Get \"https://cdn01.quay.io/quayio-production-s3/sha256/bd/bde1061b70edbc1bc91b49845034313e3c0902504bdea70ec607eba1102601fc?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251125%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251125T092020Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=289414932c66a72dddca90cc2844f13e5adfb14772f6067975a42ed31568a74a&region=us-east-1&namespace=redhat-prod&username=redhat-prod+registry_proxy&repo_name=ubi9----httpd-24&akamai_signature=exp=1764063320~hmac=f09e452a2d4b2e0abd3de843400a415991f2bc0de01e2d0755905e0633c318b8\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 25 09:20:30 crc kubenswrapper[4565]: E1125 09:20:30.996697 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"copying system image from manifest list: parsing image configuration: Get 
\\\"https://cdn01.quay.io/quayio-production-s3/sha256/bd/bde1061b70edbc1bc91b49845034313e3c0902504bdea70ec607eba1102601fc?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251125%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251125T092020Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=289414932c66a72dddca90cc2844f13e5adfb14772f6067975a42ed31568a74a&region=us-east-1&namespace=redhat-prod&username=redhat-prod+registry_proxy&repo_name=ubi9----httpd-24&akamai_signature=exp=1764063320~hmac=f09e452a2d4b2e0abd3de843400a415991f2bc0de01e2d0755905e0633c318b8\\\": net/http: TLS handshake timeout\"" pod="openstack/ceilometer-0" podUID="ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa" Nov 25 09:20:31 crc kubenswrapper[4565]: I1125 09:20:31.377870 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa" containerName="ceilometer-central-agent" containerID="cri-o://d326b75bb39d841cbd1a72eade51295ceda61a7028f73dff3eb4cadd89615371" gracePeriod=30 Nov 25 09:20:31 crc kubenswrapper[4565]: I1125 09:20:31.378218 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa" containerName="sg-core" containerID="cri-o://915ebe9fbf848dc071f53fcc1990c1aa7c86627c20f80970bd3771a2828ef470" gracePeriod=30 Nov 25 09:20:31 crc kubenswrapper[4565]: I1125 09:20:31.378399 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa" containerName="ceilometer-notification-agent" containerID="cri-o://facdb711870996a1763cf3843d9c3e85fc6e7ae9006397324a7e10343027243d" gracePeriod=30 Nov 25 09:20:32 crc kubenswrapper[4565]: I1125 09:20:32.392479 4565 generic.go:334] "Generic (PLEG): container finished" podID="ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa" 
containerID="915ebe9fbf848dc071f53fcc1990c1aa7c86627c20f80970bd3771a2828ef470" exitCode=2 Nov 25 09:20:32 crc kubenswrapper[4565]: I1125 09:20:32.393092 4565 generic.go:334] "Generic (PLEG): container finished" podID="ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa" containerID="d326b75bb39d841cbd1a72eade51295ceda61a7028f73dff3eb4cadd89615371" exitCode=0 Nov 25 09:20:32 crc kubenswrapper[4565]: I1125 09:20:32.392602 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa","Type":"ContainerDied","Data":"915ebe9fbf848dc071f53fcc1990c1aa7c86627c20f80970bd3771a2828ef470"} Nov 25 09:20:32 crc kubenswrapper[4565]: I1125 09:20:32.393155 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa","Type":"ContainerDied","Data":"d326b75bb39d841cbd1a72eade51295ceda61a7028f73dff3eb4cadd89615371"} Nov 25 09:20:34 crc kubenswrapper[4565]: I1125 09:20:34.417675 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-l7jv6" event={"ID":"84c75e09-f927-412d-9a45-b122c13e711d","Type":"ContainerStarted","Data":"f93849e3715173c62cb1defb1877e385c23390def24eda663a153541e78ee958"} Nov 25 09:20:34 crc kubenswrapper[4565]: I1125 09:20:34.441662 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-l7jv6" podStartSLOduration=2.174192985 podStartE2EDuration="10.441636668s" podCreationTimestamp="2025-11-25 09:20:24 +0000 UTC" firstStartedPulling="2025-11-25 09:20:25.425643242 +0000 UTC m=+958.628138380" lastFinishedPulling="2025-11-25 09:20:33.693086924 +0000 UTC m=+966.895582063" observedRunningTime="2025-11-25 09:20:34.436997853 +0000 UTC m=+967.639492981" watchObservedRunningTime="2025-11-25 09:20:34.441636668 +0000 UTC m=+967.644131807" Nov 25 09:20:35 crc kubenswrapper[4565]: I1125 09:20:35.798670 4565 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:20:35 crc kubenswrapper[4565]: I1125 09:20:35.847285 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-combined-ca-bundle\") pod \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " Nov 25 09:20:35 crc kubenswrapper[4565]: I1125 09:20:35.849309 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-sg-core-conf-yaml\") pod \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " Nov 25 09:20:35 crc kubenswrapper[4565]: I1125 09:20:35.849394 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-run-httpd\") pod \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " Nov 25 09:20:35 crc kubenswrapper[4565]: I1125 09:20:35.849660 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-log-httpd\") pod \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " Nov 25 09:20:35 crc kubenswrapper[4565]: I1125 09:20:35.849753 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8rn2\" (UniqueName: \"kubernetes.io/projected/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-kube-api-access-d8rn2\") pod \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " Nov 25 09:20:35 crc kubenswrapper[4565]: I1125 09:20:35.849807 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-scripts\") pod \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " Nov 25 09:20:35 crc kubenswrapper[4565]: I1125 09:20:35.849855 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-config-data\") pod \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\" (UID: \"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa\") " Nov 25 09:20:35 crc kubenswrapper[4565]: I1125 09:20:35.852592 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa" (UID: "ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:20:35 crc kubenswrapper[4565]: I1125 09:20:35.855413 4565 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:35 crc kubenswrapper[4565]: I1125 09:20:35.860095 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-scripts" (OuterVolumeSpecName: "scripts") pod "ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa" (UID: "ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:20:35 crc kubenswrapper[4565]: I1125 09:20:35.860288 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa" (UID: "ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:20:35 crc kubenswrapper[4565]: I1125 09:20:35.871148 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-kube-api-access-d8rn2" (OuterVolumeSpecName: "kube-api-access-d8rn2") pod "ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa" (UID: "ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa"). InnerVolumeSpecName "kube-api-access-d8rn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:20:35 crc kubenswrapper[4565]: I1125 09:20:35.901777 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-config-data" (OuterVolumeSpecName: "config-data") pod "ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa" (UID: "ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:20:35 crc kubenswrapper[4565]: I1125 09:20:35.904105 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa" (UID: "ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:20:35 crc kubenswrapper[4565]: I1125 09:20:35.926558 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa" (UID: "ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:20:35 crc kubenswrapper[4565]: I1125 09:20:35.957628 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:35 crc kubenswrapper[4565]: I1125 09:20:35.957665 4565 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:35 crc kubenswrapper[4565]: I1125 09:20:35.957678 4565 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:35 crc kubenswrapper[4565]: I1125 09:20:35.957689 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8rn2\" (UniqueName: \"kubernetes.io/projected/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-kube-api-access-d8rn2\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:35 crc kubenswrapper[4565]: I1125 09:20:35.957704 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:35 crc kubenswrapper[4565]: I1125 09:20:35.957714 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.438855 4565 generic.go:334] "Generic (PLEG): container finished" podID="ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa" containerID="facdb711870996a1763cf3843d9c3e85fc6e7ae9006397324a7e10343027243d" exitCode=0 Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.438915 4565 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa","Type":"ContainerDied","Data":"facdb711870996a1763cf3843d9c3e85fc6e7ae9006397324a7e10343027243d"} Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.438974 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa","Type":"ContainerDied","Data":"afe9f04f1d96844dba902831a9c918fb66518aeefdbcabd5157e2d88628fce46"} Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.438999 4565 scope.go:117] "RemoveContainer" containerID="915ebe9fbf848dc071f53fcc1990c1aa7c86627c20f80970bd3771a2828ef470" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.439378 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.461330 4565 scope.go:117] "RemoveContainer" containerID="facdb711870996a1763cf3843d9c3e85fc6e7ae9006397324a7e10343027243d" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.488171 4565 scope.go:117] "RemoveContainer" containerID="d326b75bb39d841cbd1a72eade51295ceda61a7028f73dff3eb4cadd89615371" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.519142 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.521162 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.549417 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:20:36 crc kubenswrapper[4565]: E1125 09:20:36.550060 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa" containerName="sg-core" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.550081 4565 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa" containerName="sg-core" Nov 25 09:20:36 crc kubenswrapper[4565]: E1125 09:20:36.550103 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa" containerName="ceilometer-central-agent" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.550111 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa" containerName="ceilometer-central-agent" Nov 25 09:20:36 crc kubenswrapper[4565]: E1125 09:20:36.550138 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa" containerName="ceilometer-notification-agent" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.550144 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa" containerName="ceilometer-notification-agent" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.550321 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa" containerName="sg-core" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.550344 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa" containerName="ceilometer-central-agent" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.550354 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa" containerName="ceilometer-notification-agent" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.551868 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.554072 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.556015 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.570005 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.572279 4565 scope.go:117] "RemoveContainer" containerID="915ebe9fbf848dc071f53fcc1990c1aa7c86627c20f80970bd3771a2828ef470" Nov 25 09:20:36 crc kubenswrapper[4565]: E1125 09:20:36.573062 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"915ebe9fbf848dc071f53fcc1990c1aa7c86627c20f80970bd3771a2828ef470\": container with ID starting with 915ebe9fbf848dc071f53fcc1990c1aa7c86627c20f80970bd3771a2828ef470 not found: ID does not exist" containerID="915ebe9fbf848dc071f53fcc1990c1aa7c86627c20f80970bd3771a2828ef470" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.573116 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"915ebe9fbf848dc071f53fcc1990c1aa7c86627c20f80970bd3771a2828ef470"} err="failed to get container status \"915ebe9fbf848dc071f53fcc1990c1aa7c86627c20f80970bd3771a2828ef470\": rpc error: code = NotFound desc = could not find container \"915ebe9fbf848dc071f53fcc1990c1aa7c86627c20f80970bd3771a2828ef470\": container with ID starting with 915ebe9fbf848dc071f53fcc1990c1aa7c86627c20f80970bd3771a2828ef470 not found: ID does not exist" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.573150 4565 scope.go:117] "RemoveContainer" containerID="facdb711870996a1763cf3843d9c3e85fc6e7ae9006397324a7e10343027243d" Nov 25 09:20:36 crc kubenswrapper[4565]: E1125 
09:20:36.575180 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"facdb711870996a1763cf3843d9c3e85fc6e7ae9006397324a7e10343027243d\": container with ID starting with facdb711870996a1763cf3843d9c3e85fc6e7ae9006397324a7e10343027243d not found: ID does not exist" containerID="facdb711870996a1763cf3843d9c3e85fc6e7ae9006397324a7e10343027243d" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.575223 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"facdb711870996a1763cf3843d9c3e85fc6e7ae9006397324a7e10343027243d"} err="failed to get container status \"facdb711870996a1763cf3843d9c3e85fc6e7ae9006397324a7e10343027243d\": rpc error: code = NotFound desc = could not find container \"facdb711870996a1763cf3843d9c3e85fc6e7ae9006397324a7e10343027243d\": container with ID starting with facdb711870996a1763cf3843d9c3e85fc6e7ae9006397324a7e10343027243d not found: ID does not exist" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.575253 4565 scope.go:117] "RemoveContainer" containerID="d326b75bb39d841cbd1a72eade51295ceda61a7028f73dff3eb4cadd89615371" Nov 25 09:20:36 crc kubenswrapper[4565]: E1125 09:20:36.575746 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d326b75bb39d841cbd1a72eade51295ceda61a7028f73dff3eb4cadd89615371\": container with ID starting with d326b75bb39d841cbd1a72eade51295ceda61a7028f73dff3eb4cadd89615371 not found: ID does not exist" containerID="d326b75bb39d841cbd1a72eade51295ceda61a7028f73dff3eb4cadd89615371" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.575768 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d326b75bb39d841cbd1a72eade51295ceda61a7028f73dff3eb4cadd89615371"} err="failed to get container status \"d326b75bb39d841cbd1a72eade51295ceda61a7028f73dff3eb4cadd89615371\": rpc 
error: code = NotFound desc = could not find container \"d326b75bb39d841cbd1a72eade51295ceda61a7028f73dff3eb4cadd89615371\": container with ID starting with d326b75bb39d841cbd1a72eade51295ceda61a7028f73dff3eb4cadd89615371 not found: ID does not exist" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.668595 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93717c97-0833-46cd-bb1b-062e65667195-log-httpd\") pod \"ceilometer-0\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") " pod="openstack/ceilometer-0" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.668639 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93717c97-0833-46cd-bb1b-062e65667195-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") " pod="openstack/ceilometer-0" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.668663 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93717c97-0833-46cd-bb1b-062e65667195-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") " pod="openstack/ceilometer-0" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.668730 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djgn4\" (UniqueName: \"kubernetes.io/projected/93717c97-0833-46cd-bb1b-062e65667195-kube-api-access-djgn4\") pod \"ceilometer-0\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") " pod="openstack/ceilometer-0" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.668751 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/93717c97-0833-46cd-bb1b-062e65667195-run-httpd\") pod \"ceilometer-0\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") " pod="openstack/ceilometer-0" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.668769 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93717c97-0833-46cd-bb1b-062e65667195-config-data\") pod \"ceilometer-0\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") " pod="openstack/ceilometer-0" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.668797 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93717c97-0833-46cd-bb1b-062e65667195-scripts\") pod \"ceilometer-0\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") " pod="openstack/ceilometer-0" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.770252 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93717c97-0833-46cd-bb1b-062e65667195-scripts\") pod \"ceilometer-0\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") " pod="openstack/ceilometer-0" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.770402 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93717c97-0833-46cd-bb1b-062e65667195-log-httpd\") pod \"ceilometer-0\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") " pod="openstack/ceilometer-0" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.770438 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93717c97-0833-46cd-bb1b-062e65667195-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") " pod="openstack/ceilometer-0" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 
09:20:36.770468 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93717c97-0833-46cd-bb1b-062e65667195-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") " pod="openstack/ceilometer-0" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.770565 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djgn4\" (UniqueName: \"kubernetes.io/projected/93717c97-0833-46cd-bb1b-062e65667195-kube-api-access-djgn4\") pod \"ceilometer-0\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") " pod="openstack/ceilometer-0" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.770596 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93717c97-0833-46cd-bb1b-062e65667195-run-httpd\") pod \"ceilometer-0\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") " pod="openstack/ceilometer-0" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.770615 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93717c97-0833-46cd-bb1b-062e65667195-config-data\") pod \"ceilometer-0\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") " pod="openstack/ceilometer-0" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.772284 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93717c97-0833-46cd-bb1b-062e65667195-run-httpd\") pod \"ceilometer-0\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") " pod="openstack/ceilometer-0" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.772318 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93717c97-0833-46cd-bb1b-062e65667195-log-httpd\") pod \"ceilometer-0\" (UID: 
\"93717c97-0833-46cd-bb1b-062e65667195\") " pod="openstack/ceilometer-0" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.775632 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93717c97-0833-46cd-bb1b-062e65667195-config-data\") pod \"ceilometer-0\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") " pod="openstack/ceilometer-0" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.776111 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93717c97-0833-46cd-bb1b-062e65667195-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") " pod="openstack/ceilometer-0" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.776223 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93717c97-0833-46cd-bb1b-062e65667195-scripts\") pod \"ceilometer-0\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") " pod="openstack/ceilometer-0" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.776781 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93717c97-0833-46cd-bb1b-062e65667195-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") " pod="openstack/ceilometer-0" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.791503 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djgn4\" (UniqueName: \"kubernetes.io/projected/93717c97-0833-46cd-bb1b-062e65667195-kube-api-access-djgn4\") pod \"ceilometer-0\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") " pod="openstack/ceilometer-0" Nov 25 09:20:36 crc kubenswrapper[4565]: I1125 09:20:36.871227 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:20:37 crc kubenswrapper[4565]: I1125 09:20:37.110296 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa" path="/var/lib/kubelet/pods/ed3a56a2-3d22-4c46-aa23-b4cd9afaa1aa/volumes" Nov 25 09:20:37 crc kubenswrapper[4565]: I1125 09:20:37.306067 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:20:37 crc kubenswrapper[4565]: I1125 09:20:37.457030 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93717c97-0833-46cd-bb1b-062e65667195","Type":"ContainerStarted","Data":"5a354569d005a5f39c4fd2e984e148fbd4d6b800cdbd9ff090757686038a3582"} Nov 25 09:20:39 crc kubenswrapper[4565]: I1125 09:20:39.480982 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93717c97-0833-46cd-bb1b-062e65667195","Type":"ContainerStarted","Data":"6d82cf82b26a114380674af7822fb6d734b44eb4ec562cfc9cfbdca55dd3c85b"} Nov 25 09:20:40 crc kubenswrapper[4565]: I1125 09:20:40.497102 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93717c97-0833-46cd-bb1b-062e65667195","Type":"ContainerStarted","Data":"12b3252c0abe9e2d166361fd651af8edc3678b9b09e2943260d4f756d6acf679"} Nov 25 09:20:40 crc kubenswrapper[4565]: I1125 09:20:40.497513 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93717c97-0833-46cd-bb1b-062e65667195","Type":"ContainerStarted","Data":"8ef703925cb9fcec1e23636e3acab3202d372199a5892e393673dbb0d6e47e4f"} Nov 25 09:20:41 crc kubenswrapper[4565]: I1125 09:20:41.508125 4565 generic.go:334] "Generic (PLEG): container finished" podID="84c75e09-f927-412d-9a45-b122c13e711d" containerID="f93849e3715173c62cb1defb1877e385c23390def24eda663a153541e78ee958" exitCode=0 Nov 25 09:20:41 crc kubenswrapper[4565]: I1125 09:20:41.508693 4565 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-l7jv6" event={"ID":"84c75e09-f927-412d-9a45-b122c13e711d","Type":"ContainerDied","Data":"f93849e3715173c62cb1defb1877e385c23390def24eda663a153541e78ee958"} Nov 25 09:20:42 crc kubenswrapper[4565]: I1125 09:20:42.527015 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93717c97-0833-46cd-bb1b-062e65667195","Type":"ContainerStarted","Data":"c3ef2f64fe30e87e03aa02c99a2f75a529e3e0176c45204d950ec0ae4f985121"} Nov 25 09:20:42 crc kubenswrapper[4565]: I1125 09:20:42.527392 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 25 09:20:42 crc kubenswrapper[4565]: I1125 09:20:42.556786 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.310679098 podStartE2EDuration="6.556754891s" podCreationTimestamp="2025-11-25 09:20:36 +0000 UTC" firstStartedPulling="2025-11-25 09:20:37.314549276 +0000 UTC m=+970.517044403" lastFinishedPulling="2025-11-25 09:20:41.560625058 +0000 UTC m=+974.763120196" observedRunningTime="2025-11-25 09:20:42.55336468 +0000 UTC m=+975.755859818" watchObservedRunningTime="2025-11-25 09:20:42.556754891 +0000 UTC m=+975.759250029" Nov 25 09:20:42 crc kubenswrapper[4565]: I1125 09:20:42.830847 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-l7jv6" Nov 25 09:20:42 crc kubenswrapper[4565]: I1125 09:20:42.920553 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84c75e09-f927-412d-9a45-b122c13e711d-scripts\") pod \"84c75e09-f927-412d-9a45-b122c13e711d\" (UID: \"84c75e09-f927-412d-9a45-b122c13e711d\") " Nov 25 09:20:42 crc kubenswrapper[4565]: I1125 09:20:42.920740 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64gwm\" (UniqueName: \"kubernetes.io/projected/84c75e09-f927-412d-9a45-b122c13e711d-kube-api-access-64gwm\") pod \"84c75e09-f927-412d-9a45-b122c13e711d\" (UID: \"84c75e09-f927-412d-9a45-b122c13e711d\") " Nov 25 09:20:42 crc kubenswrapper[4565]: I1125 09:20:42.920840 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84c75e09-f927-412d-9a45-b122c13e711d-combined-ca-bundle\") pod \"84c75e09-f927-412d-9a45-b122c13e711d\" (UID: \"84c75e09-f927-412d-9a45-b122c13e711d\") " Nov 25 09:20:42 crc kubenswrapper[4565]: I1125 09:20:42.920995 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84c75e09-f927-412d-9a45-b122c13e711d-config-data\") pod \"84c75e09-f927-412d-9a45-b122c13e711d\" (UID: \"84c75e09-f927-412d-9a45-b122c13e711d\") " Nov 25 09:20:42 crc kubenswrapper[4565]: I1125 09:20:42.927473 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84c75e09-f927-412d-9a45-b122c13e711d-scripts" (OuterVolumeSpecName: "scripts") pod "84c75e09-f927-412d-9a45-b122c13e711d" (UID: "84c75e09-f927-412d-9a45-b122c13e711d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:20:42 crc kubenswrapper[4565]: I1125 09:20:42.934548 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84c75e09-f927-412d-9a45-b122c13e711d-kube-api-access-64gwm" (OuterVolumeSpecName: "kube-api-access-64gwm") pod "84c75e09-f927-412d-9a45-b122c13e711d" (UID: "84c75e09-f927-412d-9a45-b122c13e711d"). InnerVolumeSpecName "kube-api-access-64gwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:20:42 crc kubenswrapper[4565]: I1125 09:20:42.947959 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84c75e09-f927-412d-9a45-b122c13e711d-config-data" (OuterVolumeSpecName: "config-data") pod "84c75e09-f927-412d-9a45-b122c13e711d" (UID: "84c75e09-f927-412d-9a45-b122c13e711d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:20:42 crc kubenswrapper[4565]: I1125 09:20:42.998855 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84c75e09-f927-412d-9a45-b122c13e711d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84c75e09-f927-412d-9a45-b122c13e711d" (UID: "84c75e09-f927-412d-9a45-b122c13e711d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:20:43 crc kubenswrapper[4565]: I1125 09:20:43.023572 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64gwm\" (UniqueName: \"kubernetes.io/projected/84c75e09-f927-412d-9a45-b122c13e711d-kube-api-access-64gwm\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:43 crc kubenswrapper[4565]: I1125 09:20:43.023607 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84c75e09-f927-412d-9a45-b122c13e711d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:43 crc kubenswrapper[4565]: I1125 09:20:43.023619 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84c75e09-f927-412d-9a45-b122c13e711d-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:43 crc kubenswrapper[4565]: I1125 09:20:43.023631 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84c75e09-f927-412d-9a45-b122c13e711d-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:20:43 crc kubenswrapper[4565]: I1125 09:20:43.536366 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-l7jv6" Nov 25 09:20:43 crc kubenswrapper[4565]: I1125 09:20:43.536500 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-l7jv6" event={"ID":"84c75e09-f927-412d-9a45-b122c13e711d","Type":"ContainerDied","Data":"329f273d3039ca20ba00377e3f29b05f02d671f79179319d58ff6ae3437b5b72"} Nov 25 09:20:43 crc kubenswrapper[4565]: I1125 09:20:43.537334 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="329f273d3039ca20ba00377e3f29b05f02d671f79179319d58ff6ae3437b5b72" Nov 25 09:20:43 crc kubenswrapper[4565]: I1125 09:20:43.624696 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 25 09:20:43 crc kubenswrapper[4565]: E1125 09:20:43.625256 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84c75e09-f927-412d-9a45-b122c13e711d" containerName="nova-cell0-conductor-db-sync" Nov 25 09:20:43 crc kubenswrapper[4565]: I1125 09:20:43.625283 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="84c75e09-f927-412d-9a45-b122c13e711d" containerName="nova-cell0-conductor-db-sync" Nov 25 09:20:43 crc kubenswrapper[4565]: I1125 09:20:43.625577 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="84c75e09-f927-412d-9a45-b122c13e711d" containerName="nova-cell0-conductor-db-sync" Nov 25 09:20:43 crc kubenswrapper[4565]: I1125 09:20:43.626252 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 25 09:20:43 crc kubenswrapper[4565]: I1125 09:20:43.629840 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 25 09:20:43 crc kubenswrapper[4565]: I1125 09:20:43.635918 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-tjxjs" Nov 25 09:20:43 crc kubenswrapper[4565]: I1125 09:20:43.644279 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 25 09:20:43 crc kubenswrapper[4565]: I1125 09:20:43.735231 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba666b44-183c-4752-8f43-899a921da911-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ba666b44-183c-4752-8f43-899a921da911\") " pod="openstack/nova-cell0-conductor-0" Nov 25 09:20:43 crc kubenswrapper[4565]: I1125 09:20:43.735280 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba666b44-183c-4752-8f43-899a921da911-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ba666b44-183c-4752-8f43-899a921da911\") " pod="openstack/nova-cell0-conductor-0" Nov 25 09:20:43 crc kubenswrapper[4565]: I1125 09:20:43.735423 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4blmv\" (UniqueName: \"kubernetes.io/projected/ba666b44-183c-4752-8f43-899a921da911-kube-api-access-4blmv\") pod \"nova-cell0-conductor-0\" (UID: \"ba666b44-183c-4752-8f43-899a921da911\") " pod="openstack/nova-cell0-conductor-0" Nov 25 09:20:43 crc kubenswrapper[4565]: I1125 09:20:43.838144 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4blmv\" (UniqueName: 
\"kubernetes.io/projected/ba666b44-183c-4752-8f43-899a921da911-kube-api-access-4blmv\") pod \"nova-cell0-conductor-0\" (UID: \"ba666b44-183c-4752-8f43-899a921da911\") " pod="openstack/nova-cell0-conductor-0" Nov 25 09:20:43 crc kubenswrapper[4565]: I1125 09:20:43.838592 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba666b44-183c-4752-8f43-899a921da911-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ba666b44-183c-4752-8f43-899a921da911\") " pod="openstack/nova-cell0-conductor-0" Nov 25 09:20:43 crc kubenswrapper[4565]: I1125 09:20:43.838712 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba666b44-183c-4752-8f43-899a921da911-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ba666b44-183c-4752-8f43-899a921da911\") " pod="openstack/nova-cell0-conductor-0" Nov 25 09:20:43 crc kubenswrapper[4565]: I1125 09:20:43.844185 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba666b44-183c-4752-8f43-899a921da911-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ba666b44-183c-4752-8f43-899a921da911\") " pod="openstack/nova-cell0-conductor-0" Nov 25 09:20:43 crc kubenswrapper[4565]: I1125 09:20:43.846233 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba666b44-183c-4752-8f43-899a921da911-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ba666b44-183c-4752-8f43-899a921da911\") " pod="openstack/nova-cell0-conductor-0" Nov 25 09:20:43 crc kubenswrapper[4565]: I1125 09:20:43.853560 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4blmv\" (UniqueName: \"kubernetes.io/projected/ba666b44-183c-4752-8f43-899a921da911-kube-api-access-4blmv\") pod \"nova-cell0-conductor-0\" (UID: 
\"ba666b44-183c-4752-8f43-899a921da911\") " pod="openstack/nova-cell0-conductor-0" Nov 25 09:20:43 crc kubenswrapper[4565]: I1125 09:20:43.948031 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 25 09:20:44 crc kubenswrapper[4565]: W1125 09:20:44.358682 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba666b44_183c_4752_8f43_899a921da911.slice/crio-02c86e40eb88b82b1378f5bfe3e06e968815dae3e45a9b1d306e0c69a9517f67 WatchSource:0}: Error finding container 02c86e40eb88b82b1378f5bfe3e06e968815dae3e45a9b1d306e0c69a9517f67: Status 404 returned error can't find the container with id 02c86e40eb88b82b1378f5bfe3e06e968815dae3e45a9b1d306e0c69a9517f67 Nov 25 09:20:44 crc kubenswrapper[4565]: I1125 09:20:44.362538 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 25 09:20:44 crc kubenswrapper[4565]: I1125 09:20:44.550571 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ba666b44-183c-4752-8f43-899a921da911","Type":"ContainerStarted","Data":"1df8fedba78319bbd76c83f26ffe2b42832763636016c70da148690536332835"} Nov 25 09:20:44 crc kubenswrapper[4565]: I1125 09:20:44.550860 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ba666b44-183c-4752-8f43-899a921da911","Type":"ContainerStarted","Data":"02c86e40eb88b82b1378f5bfe3e06e968815dae3e45a9b1d306e0c69a9517f67"} Nov 25 09:20:44 crc kubenswrapper[4565]: I1125 09:20:44.551989 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 25 09:20:44 crc kubenswrapper[4565]: I1125 09:20:44.570648 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.5706296960000001 podStartE2EDuration="1.570629696s" 
podCreationTimestamp="2025-11-25 09:20:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:20:44.570534206 +0000 UTC m=+977.773029344" watchObservedRunningTime="2025-11-25 09:20:44.570629696 +0000 UTC m=+977.773124835" Nov 25 09:20:53 crc kubenswrapper[4565]: I1125 09:20:53.971545 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.452085 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-x72jl"] Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.453582 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x72jl" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.455560 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.457879 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.503686 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-x72jl"] Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.544921 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsjq2\" (UniqueName: \"kubernetes.io/projected/1097aa33-9ec1-4839-b8ea-faa176627408-kube-api-access-fsjq2\") pod \"nova-cell0-cell-mapping-x72jl\" (UID: \"1097aa33-9ec1-4839-b8ea-faa176627408\") " pod="openstack/nova-cell0-cell-mapping-x72jl" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.545019 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1097aa33-9ec1-4839-b8ea-faa176627408-config-data\") pod \"nova-cell0-cell-mapping-x72jl\" (UID: \"1097aa33-9ec1-4839-b8ea-faa176627408\") " pod="openstack/nova-cell0-cell-mapping-x72jl" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.545118 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1097aa33-9ec1-4839-b8ea-faa176627408-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-x72jl\" (UID: \"1097aa33-9ec1-4839-b8ea-faa176627408\") " pod="openstack/nova-cell0-cell-mapping-x72jl" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.545159 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1097aa33-9ec1-4839-b8ea-faa176627408-scripts\") pod \"nova-cell0-cell-mapping-x72jl\" (UID: \"1097aa33-9ec1-4839-b8ea-faa176627408\") " pod="openstack/nova-cell0-cell-mapping-x72jl" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.651540 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.652867 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.654150 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1097aa33-9ec1-4839-b8ea-faa176627408-scripts\") pod \"nova-cell0-cell-mapping-x72jl\" (UID: \"1097aa33-9ec1-4839-b8ea-faa176627408\") " pod="openstack/nova-cell0-cell-mapping-x72jl" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.654344 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsjq2\" (UniqueName: \"kubernetes.io/projected/1097aa33-9ec1-4839-b8ea-faa176627408-kube-api-access-fsjq2\") pod \"nova-cell0-cell-mapping-x72jl\" (UID: \"1097aa33-9ec1-4839-b8ea-faa176627408\") " pod="openstack/nova-cell0-cell-mapping-x72jl" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.654389 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1097aa33-9ec1-4839-b8ea-faa176627408-config-data\") pod \"nova-cell0-cell-mapping-x72jl\" (UID: \"1097aa33-9ec1-4839-b8ea-faa176627408\") " pod="openstack/nova-cell0-cell-mapping-x72jl" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.654522 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1097aa33-9ec1-4839-b8ea-faa176627408-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-x72jl\" (UID: \"1097aa33-9ec1-4839-b8ea-faa176627408\") " pod="openstack/nova-cell0-cell-mapping-x72jl" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.661523 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1097aa33-9ec1-4839-b8ea-faa176627408-config-data\") pod \"nova-cell0-cell-mapping-x72jl\" (UID: \"1097aa33-9ec1-4839-b8ea-faa176627408\") " pod="openstack/nova-cell0-cell-mapping-x72jl" Nov 25 09:20:54 
crc kubenswrapper[4565]: I1125 09:20:54.663453 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1097aa33-9ec1-4839-b8ea-faa176627408-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-x72jl\" (UID: \"1097aa33-9ec1-4839-b8ea-faa176627408\") " pod="openstack/nova-cell0-cell-mapping-x72jl" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.665775 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.668446 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1097aa33-9ec1-4839-b8ea-faa176627408-scripts\") pod \"nova-cell0-cell-mapping-x72jl\" (UID: \"1097aa33-9ec1-4839-b8ea-faa176627408\") " pod="openstack/nova-cell0-cell-mapping-x72jl" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.677765 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsjq2\" (UniqueName: \"kubernetes.io/projected/1097aa33-9ec1-4839-b8ea-faa176627408-kube-api-access-fsjq2\") pod \"nova-cell0-cell-mapping-x72jl\" (UID: \"1097aa33-9ec1-4839-b8ea-faa176627408\") " pod="openstack/nova-cell0-cell-mapping-x72jl" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.698734 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.699955 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.704859 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.713215 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.720488 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.726515 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.729791 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.732753 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.757299 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/751b5095-237a-402c-b339-1bb87853d797-config-data\") pod \"nova-api-0\" (UID: \"751b5095-237a-402c-b339-1bb87853d797\") " pod="openstack/nova-api-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.757374 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/751b5095-237a-402c-b339-1bb87853d797-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"751b5095-237a-402c-b339-1bb87853d797\") " pod="openstack/nova-api-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.757445 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/751b5095-237a-402c-b339-1bb87853d797-logs\") pod \"nova-api-0\" (UID: \"751b5095-237a-402c-b339-1bb87853d797\") " pod="openstack/nova-api-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.757690 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hhvq\" (UniqueName: \"kubernetes.io/projected/751b5095-237a-402c-b339-1bb87853d797-kube-api-access-6hhvq\") pod \"nova-api-0\" (UID: \"751b5095-237a-402c-b339-1bb87853d797\") " pod="openstack/nova-api-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.778201 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x72jl" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.785165 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.863756 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hhvq\" (UniqueName: \"kubernetes.io/projected/751b5095-237a-402c-b339-1bb87853d797-kube-api-access-6hhvq\") pod \"nova-api-0\" (UID: \"751b5095-237a-402c-b339-1bb87853d797\") " pod="openstack/nova-api-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.864292 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/751b5095-237a-402c-b339-1bb87853d797-config-data\") pod \"nova-api-0\" (UID: \"751b5095-237a-402c-b339-1bb87853d797\") " pod="openstack/nova-api-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.864356 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/751b5095-237a-402c-b339-1bb87853d797-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"751b5095-237a-402c-b339-1bb87853d797\") " pod="openstack/nova-api-0" Nov 25 09:20:54 crc kubenswrapper[4565]: 
I1125 09:20:54.864426 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27dea053-ce0e-4727-a310-785ccfde4424-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"27dea053-ce0e-4727-a310-785ccfde4424\") " pod="openstack/nova-scheduler-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.864456 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/751b5095-237a-402c-b339-1bb87853d797-logs\") pod \"nova-api-0\" (UID: \"751b5095-237a-402c-b339-1bb87853d797\") " pod="openstack/nova-api-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.864490 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06be536c-6234-4539-ae36-7bd60a6d2097-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"06be536c-6234-4539-ae36-7bd60a6d2097\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.864529 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27dea053-ce0e-4727-a310-785ccfde4424-config-data\") pod \"nova-scheduler-0\" (UID: \"27dea053-ce0e-4727-a310-785ccfde4424\") " pod="openstack/nova-scheduler-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.864559 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh468\" (UniqueName: \"kubernetes.io/projected/06be536c-6234-4539-ae36-7bd60a6d2097-kube-api-access-wh468\") pod \"nova-cell1-novncproxy-0\" (UID: \"06be536c-6234-4539-ae36-7bd60a6d2097\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.864580 4565 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv8ts\" (UniqueName: \"kubernetes.io/projected/27dea053-ce0e-4727-a310-785ccfde4424-kube-api-access-vv8ts\") pod \"nova-scheduler-0\" (UID: \"27dea053-ce0e-4727-a310-785ccfde4424\") " pod="openstack/nova-scheduler-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.864605 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06be536c-6234-4539-ae36-7bd60a6d2097-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"06be536c-6234-4539-ae36-7bd60a6d2097\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.881388 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/751b5095-237a-402c-b339-1bb87853d797-logs\") pod \"nova-api-0\" (UID: \"751b5095-237a-402c-b339-1bb87853d797\") " pod="openstack/nova-api-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.914395 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/751b5095-237a-402c-b339-1bb87853d797-config-data\") pod \"nova-api-0\" (UID: \"751b5095-237a-402c-b339-1bb87853d797\") " pod="openstack/nova-api-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.939887 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hhvq\" (UniqueName: \"kubernetes.io/projected/751b5095-237a-402c-b339-1bb87853d797-kube-api-access-6hhvq\") pod \"nova-api-0\" (UID: \"751b5095-237a-402c-b339-1bb87853d797\") " pod="openstack/nova-api-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.967382 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.970679 4565 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27dea053-ce0e-4727-a310-785ccfde4424-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"27dea053-ce0e-4727-a310-785ccfde4424\") " pod="openstack/nova-scheduler-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.970746 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06be536c-6234-4539-ae36-7bd60a6d2097-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"06be536c-6234-4539-ae36-7bd60a6d2097\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.970784 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27dea053-ce0e-4727-a310-785ccfde4424-config-data\") pod \"nova-scheduler-0\" (UID: \"27dea053-ce0e-4727-a310-785ccfde4424\") " pod="openstack/nova-scheduler-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.971304 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh468\" (UniqueName: \"kubernetes.io/projected/06be536c-6234-4539-ae36-7bd60a6d2097-kube-api-access-wh468\") pod \"nova-cell1-novncproxy-0\" (UID: \"06be536c-6234-4539-ae36-7bd60a6d2097\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.971353 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv8ts\" (UniqueName: \"kubernetes.io/projected/27dea053-ce0e-4727-a310-785ccfde4424-kube-api-access-vv8ts\") pod \"nova-scheduler-0\" (UID: \"27dea053-ce0e-4727-a310-785ccfde4424\") " pod="openstack/nova-scheduler-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.971423 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/06be536c-6234-4539-ae36-7bd60a6d2097-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"06be536c-6234-4539-ae36-7bd60a6d2097\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.971423 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/751b5095-237a-402c-b339-1bb87853d797-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"751b5095-237a-402c-b339-1bb87853d797\") " pod="openstack/nova-api-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.975266 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.983055 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27dea053-ce0e-4727-a310-785ccfde4424-config-data\") pod \"nova-scheduler-0\" (UID: \"27dea053-ce0e-4727-a310-785ccfde4424\") " pod="openstack/nova-scheduler-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.983389 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.984510 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06be536c-6234-4539-ae36-7bd60a6d2097-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"06be536c-6234-4539-ae36-7bd60a6d2097\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:20:54 crc kubenswrapper[4565]: I1125 09:20:54.987635 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27dea053-ce0e-4727-a310-785ccfde4424-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"27dea053-ce0e-4727-a310-785ccfde4424\") " pod="openstack/nova-scheduler-0" Nov 25 09:20:54 
crc kubenswrapper[4565]: I1125 09:20:54.995293 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06be536c-6234-4539-ae36-7bd60a6d2097-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"06be536c-6234-4539-ae36-7bd60a6d2097\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.005354 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.008700 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh468\" (UniqueName: \"kubernetes.io/projected/06be536c-6234-4539-ae36-7bd60a6d2097-kube-api-access-wh468\") pod \"nova-cell1-novncproxy-0\" (UID: \"06be536c-6234-4539-ae36-7bd60a6d2097\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.022066 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv8ts\" (UniqueName: \"kubernetes.io/projected/27dea053-ce0e-4727-a310-785ccfde4424-kube-api-access-vv8ts\") pod \"nova-scheduler-0\" (UID: \"27dea053-ce0e-4727-a310-785ccfde4424\") " pod="openstack/nova-scheduler-0" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.051233 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69494d9f89-7hsws"] Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.061215 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69494d9f89-7hsws"] Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.061336 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69494d9f89-7hsws" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.072366 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.073775 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/533752d5-2780-4d49-add9-847fe3232f5d-logs\") pod \"nova-metadata-0\" (UID: \"533752d5-2780-4d49-add9-847fe3232f5d\") " pod="openstack/nova-metadata-0" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.073911 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/533752d5-2780-4d49-add9-847fe3232f5d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"533752d5-2780-4d49-add9-847fe3232f5d\") " pod="openstack/nova-metadata-0" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.074616 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/533752d5-2780-4d49-add9-847fe3232f5d-config-data\") pod \"nova-metadata-0\" (UID: \"533752d5-2780-4d49-add9-847fe3232f5d\") " pod="openstack/nova-metadata-0" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.077557 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpvlp\" (UniqueName: \"kubernetes.io/projected/533752d5-2780-4d49-add9-847fe3232f5d-kube-api-access-jpvlp\") pod \"nova-metadata-0\" (UID: \"533752d5-2780-4d49-add9-847fe3232f5d\") " pod="openstack/nova-metadata-0" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.087525 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.179570 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/533752d5-2780-4d49-add9-847fe3232f5d-logs\") pod \"nova-metadata-0\" (UID: \"533752d5-2780-4d49-add9-847fe3232f5d\") " pod="openstack/nova-metadata-0" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.179847 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/533752d5-2780-4d49-add9-847fe3232f5d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"533752d5-2780-4d49-add9-847fe3232f5d\") " pod="openstack/nova-metadata-0" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.179882 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46e9931e-fcd4-4c18-8c06-537d4c162c1f-ovsdbserver-nb\") pod \"dnsmasq-dns-69494d9f89-7hsws\" (UID: \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\") " pod="openstack/dnsmasq-dns-69494d9f89-7hsws" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.179957 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/533752d5-2780-4d49-add9-847fe3232f5d-config-data\") pod \"nova-metadata-0\" (UID: \"533752d5-2780-4d49-add9-847fe3232f5d\") " pod="openstack/nova-metadata-0" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.179994 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46e9931e-fcd4-4c18-8c06-537d4c162c1f-ovsdbserver-sb\") pod \"dnsmasq-dns-69494d9f89-7hsws\" (UID: \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\") " pod="openstack/dnsmasq-dns-69494d9f89-7hsws" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 
09:20:55.180033 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpvlp\" (UniqueName: \"kubernetes.io/projected/533752d5-2780-4d49-add9-847fe3232f5d-kube-api-access-jpvlp\") pod \"nova-metadata-0\" (UID: \"533752d5-2780-4d49-add9-847fe3232f5d\") " pod="openstack/nova-metadata-0" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.180077 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twkgv\" (UniqueName: \"kubernetes.io/projected/46e9931e-fcd4-4c18-8c06-537d4c162c1f-kube-api-access-twkgv\") pod \"dnsmasq-dns-69494d9f89-7hsws\" (UID: \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\") " pod="openstack/dnsmasq-dns-69494d9f89-7hsws" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.180172 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46e9931e-fcd4-4c18-8c06-537d4c162c1f-config\") pod \"dnsmasq-dns-69494d9f89-7hsws\" (UID: \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\") " pod="openstack/dnsmasq-dns-69494d9f89-7hsws" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.180207 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46e9931e-fcd4-4c18-8c06-537d4c162c1f-dns-svc\") pod \"dnsmasq-dns-69494d9f89-7hsws\" (UID: \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\") " pod="openstack/dnsmasq-dns-69494d9f89-7hsws" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.180757 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/533752d5-2780-4d49-add9-847fe3232f5d-logs\") pod \"nova-metadata-0\" (UID: \"533752d5-2780-4d49-add9-847fe3232f5d\") " pod="openstack/nova-metadata-0" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.187445 4565 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/533752d5-2780-4d49-add9-847fe3232f5d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"533752d5-2780-4d49-add9-847fe3232f5d\") " pod="openstack/nova-metadata-0" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.187475 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/533752d5-2780-4d49-add9-847fe3232f5d-config-data\") pod \"nova-metadata-0\" (UID: \"533752d5-2780-4d49-add9-847fe3232f5d\") " pod="openstack/nova-metadata-0" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.194302 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpvlp\" (UniqueName: \"kubernetes.io/projected/533752d5-2780-4d49-add9-847fe3232f5d-kube-api-access-jpvlp\") pod \"nova-metadata-0\" (UID: \"533752d5-2780-4d49-add9-847fe3232f5d\") " pod="openstack/nova-metadata-0" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.220529 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.274815 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-x72jl"] Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.284496 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46e9931e-fcd4-4c18-8c06-537d4c162c1f-config\") pod \"dnsmasq-dns-69494d9f89-7hsws\" (UID: \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\") " pod="openstack/dnsmasq-dns-69494d9f89-7hsws" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.284555 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46e9931e-fcd4-4c18-8c06-537d4c162c1f-dns-svc\") pod \"dnsmasq-dns-69494d9f89-7hsws\" (UID: \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\") " pod="openstack/dnsmasq-dns-69494d9f89-7hsws" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.284694 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46e9931e-fcd4-4c18-8c06-537d4c162c1f-ovsdbserver-nb\") pod \"dnsmasq-dns-69494d9f89-7hsws\" (UID: \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\") " pod="openstack/dnsmasq-dns-69494d9f89-7hsws" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.284797 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46e9931e-fcd4-4c18-8c06-537d4c162c1f-ovsdbserver-sb\") pod \"dnsmasq-dns-69494d9f89-7hsws\" (UID: \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\") " pod="openstack/dnsmasq-dns-69494d9f89-7hsws" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.284867 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twkgv\" (UniqueName: 
\"kubernetes.io/projected/46e9931e-fcd4-4c18-8c06-537d4c162c1f-kube-api-access-twkgv\") pod \"dnsmasq-dns-69494d9f89-7hsws\" (UID: \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\") " pod="openstack/dnsmasq-dns-69494d9f89-7hsws" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.285826 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46e9931e-fcd4-4c18-8c06-537d4c162c1f-config\") pod \"dnsmasq-dns-69494d9f89-7hsws\" (UID: \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\") " pod="openstack/dnsmasq-dns-69494d9f89-7hsws" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.286517 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46e9931e-fcd4-4c18-8c06-537d4c162c1f-ovsdbserver-sb\") pod \"dnsmasq-dns-69494d9f89-7hsws\" (UID: \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\") " pod="openstack/dnsmasq-dns-69494d9f89-7hsws" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.286870 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46e9931e-fcd4-4c18-8c06-537d4c162c1f-dns-svc\") pod \"dnsmasq-dns-69494d9f89-7hsws\" (UID: \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\") " pod="openstack/dnsmasq-dns-69494d9f89-7hsws" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.287029 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46e9931e-fcd4-4c18-8c06-537d4c162c1f-ovsdbserver-nb\") pod \"dnsmasq-dns-69494d9f89-7hsws\" (UID: \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\") " pod="openstack/dnsmasq-dns-69494d9f89-7hsws" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.299907 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twkgv\" (UniqueName: \"kubernetes.io/projected/46e9931e-fcd4-4c18-8c06-537d4c162c1f-kube-api-access-twkgv\") pod 
\"dnsmasq-dns-69494d9f89-7hsws\" (UID: \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\") " pod="openstack/dnsmasq-dns-69494d9f89-7hsws" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.310689 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.385402 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69494d9f89-7hsws" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.556861 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.670761 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x72jl" event={"ID":"1097aa33-9ec1-4839-b8ea-faa176627408","Type":"ContainerStarted","Data":"96bd0e1c1f9ef62f5704df271e4e6b63d0f6f070498795bface69ad77120e221"} Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.670808 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x72jl" event={"ID":"1097aa33-9ec1-4839-b8ea-faa176627408","Type":"ContainerStarted","Data":"38f2d4ca6d9b81de82d2b58446d36e4e07834e3c09e1d09878f0405f53e0a25c"} Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.673159 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"751b5095-237a-402c-b339-1bb87853d797","Type":"ContainerStarted","Data":"5fae5afc9642f1207ce5e9eec9bdbbd2e0c042962e949d8c262ef7171f72747a"} Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.689637 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xcthc"] Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.690756 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xcthc" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.695765 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.695879 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.713855 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xcthc"] Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.714080 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-x72jl" podStartSLOduration=1.714061026 podStartE2EDuration="1.714061026s" podCreationTimestamp="2025-11-25 09:20:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:20:55.702509492 +0000 UTC m=+988.905004630" watchObservedRunningTime="2025-11-25 09:20:55.714061026 +0000 UTC m=+988.916556164" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.748754 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.814320 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf895328-1ce8-477a-8939-2fe0442bfdb9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xcthc\" (UID: \"cf895328-1ce8-477a-8939-2fe0442bfdb9\") " pod="openstack/nova-cell1-conductor-db-sync-xcthc" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.815158 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cf895328-1ce8-477a-8939-2fe0442bfdb9-config-data\") pod \"nova-cell1-conductor-db-sync-xcthc\" (UID: \"cf895328-1ce8-477a-8939-2fe0442bfdb9\") " pod="openstack/nova-cell1-conductor-db-sync-xcthc" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.815414 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf895328-1ce8-477a-8939-2fe0442bfdb9-scripts\") pod \"nova-cell1-conductor-db-sync-xcthc\" (UID: \"cf895328-1ce8-477a-8939-2fe0442bfdb9\") " pod="openstack/nova-cell1-conductor-db-sync-xcthc" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.815446 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7kkt\" (UniqueName: \"kubernetes.io/projected/cf895328-1ce8-477a-8939-2fe0442bfdb9-kube-api-access-f7kkt\") pod \"nova-cell1-conductor-db-sync-xcthc\" (UID: \"cf895328-1ce8-477a-8939-2fe0442bfdb9\") " pod="openstack/nova-cell1-conductor-db-sync-xcthc" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.836777 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.863533 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69494d9f89-7hsws"] Nov 25 09:20:55 crc kubenswrapper[4565]: W1125 09:20:55.879772 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27dea053_ce0e_4727_a310_785ccfde4424.slice/crio-9f3ef7cadee617692425dbe537942d8b6f4f7aa76d7e4ceac6a8ab900076a97f WatchSource:0}: Error finding container 9f3ef7cadee617692425dbe537942d8b6f4f7aa76d7e4ceac6a8ab900076a97f: Status 404 returned error can't find the container with id 9f3ef7cadee617692425dbe537942d8b6f4f7aa76d7e4ceac6a8ab900076a97f Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.917323 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf895328-1ce8-477a-8939-2fe0442bfdb9-config-data\") pod \"nova-cell1-conductor-db-sync-xcthc\" (UID: \"cf895328-1ce8-477a-8939-2fe0442bfdb9\") " pod="openstack/nova-cell1-conductor-db-sync-xcthc" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.917467 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf895328-1ce8-477a-8939-2fe0442bfdb9-scripts\") pod \"nova-cell1-conductor-db-sync-xcthc\" (UID: \"cf895328-1ce8-477a-8939-2fe0442bfdb9\") " pod="openstack/nova-cell1-conductor-db-sync-xcthc" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.917493 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7kkt\" (UniqueName: \"kubernetes.io/projected/cf895328-1ce8-477a-8939-2fe0442bfdb9-kube-api-access-f7kkt\") pod \"nova-cell1-conductor-db-sync-xcthc\" (UID: \"cf895328-1ce8-477a-8939-2fe0442bfdb9\") " pod="openstack/nova-cell1-conductor-db-sync-xcthc" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.917758 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf895328-1ce8-477a-8939-2fe0442bfdb9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xcthc\" (UID: \"cf895328-1ce8-477a-8939-2fe0442bfdb9\") " pod="openstack/nova-cell1-conductor-db-sync-xcthc" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.934533 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf895328-1ce8-477a-8939-2fe0442bfdb9-config-data\") pod \"nova-cell1-conductor-db-sync-xcthc\" (UID: \"cf895328-1ce8-477a-8939-2fe0442bfdb9\") " pod="openstack/nova-cell1-conductor-db-sync-xcthc" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.935150 4565 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf895328-1ce8-477a-8939-2fe0442bfdb9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xcthc\" (UID: \"cf895328-1ce8-477a-8939-2fe0442bfdb9\") " pod="openstack/nova-cell1-conductor-db-sync-xcthc" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.940617 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf895328-1ce8-477a-8939-2fe0442bfdb9-scripts\") pod \"nova-cell1-conductor-db-sync-xcthc\" (UID: \"cf895328-1ce8-477a-8939-2fe0442bfdb9\") " pod="openstack/nova-cell1-conductor-db-sync-xcthc" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.942675 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7kkt\" (UniqueName: \"kubernetes.io/projected/cf895328-1ce8-477a-8939-2fe0442bfdb9-kube-api-access-f7kkt\") pod \"nova-cell1-conductor-db-sync-xcthc\" (UID: \"cf895328-1ce8-477a-8939-2fe0442bfdb9\") " pod="openstack/nova-cell1-conductor-db-sync-xcthc" Nov 25 09:20:55 crc kubenswrapper[4565]: I1125 09:20:55.960132 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 09:20:56 crc kubenswrapper[4565]: I1125 09:20:56.105774 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xcthc" Nov 25 09:20:56 crc kubenswrapper[4565]: I1125 09:20:56.529741 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xcthc"] Nov 25 09:20:56 crc kubenswrapper[4565]: W1125 09:20:56.531482 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf895328_1ce8_477a_8939_2fe0442bfdb9.slice/crio-e89e18f407c9538413ac8bdbff57a9dbb723df076700f1cde549cc9cda98b653 WatchSource:0}: Error finding container e89e18f407c9538413ac8bdbff57a9dbb723df076700f1cde549cc9cda98b653: Status 404 returned error can't find the container with id e89e18f407c9538413ac8bdbff57a9dbb723df076700f1cde549cc9cda98b653 Nov 25 09:20:56 crc kubenswrapper[4565]: I1125 09:20:56.688741 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"533752d5-2780-4d49-add9-847fe3232f5d","Type":"ContainerStarted","Data":"c0c1389609e04c85588e2629a4835c9fa39e73a5fec6fa3d533fab359c0ce38e"} Nov 25 09:20:56 crc kubenswrapper[4565]: I1125 09:20:56.690831 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"06be536c-6234-4539-ae36-7bd60a6d2097","Type":"ContainerStarted","Data":"9e9e1f89beb01fe35ccfdf55fec17cbe7861c026c860d3b7f5046a8f04d7a148"} Nov 25 09:20:56 crc kubenswrapper[4565]: I1125 09:20:56.692473 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"27dea053-ce0e-4727-a310-785ccfde4424","Type":"ContainerStarted","Data":"9f3ef7cadee617692425dbe537942d8b6f4f7aa76d7e4ceac6a8ab900076a97f"} Nov 25 09:20:56 crc kubenswrapper[4565]: I1125 09:20:56.694444 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xcthc" 
event={"ID":"cf895328-1ce8-477a-8939-2fe0442bfdb9","Type":"ContainerStarted","Data":"7ae64e00d915cf79cc448d39974b79b42a1886dc7a0b39e952f38854e3b715ac"} Nov 25 09:20:56 crc kubenswrapper[4565]: I1125 09:20:56.694472 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xcthc" event={"ID":"cf895328-1ce8-477a-8939-2fe0442bfdb9","Type":"ContainerStarted","Data":"e89e18f407c9538413ac8bdbff57a9dbb723df076700f1cde549cc9cda98b653"} Nov 25 09:20:56 crc kubenswrapper[4565]: I1125 09:20:56.701778 4565 generic.go:334] "Generic (PLEG): container finished" podID="46e9931e-fcd4-4c18-8c06-537d4c162c1f" containerID="19bcefb80c4f2d1a52e3910027bb93ee5ac61fef22ba117ba5d92204e1aee971" exitCode=0 Nov 25 09:20:56 crc kubenswrapper[4565]: I1125 09:20:56.702208 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69494d9f89-7hsws" event={"ID":"46e9931e-fcd4-4c18-8c06-537d4c162c1f","Type":"ContainerDied","Data":"19bcefb80c4f2d1a52e3910027bb93ee5ac61fef22ba117ba5d92204e1aee971"} Nov 25 09:20:56 crc kubenswrapper[4565]: I1125 09:20:56.702233 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69494d9f89-7hsws" event={"ID":"46e9931e-fcd4-4c18-8c06-537d4c162c1f","Type":"ContainerStarted","Data":"0baf60a0c34977bcd007dc5f22242f78959fdbe5bb5a275b90a2de09292c82ca"} Nov 25 09:20:56 crc kubenswrapper[4565]: I1125 09:20:56.738274 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-xcthc" podStartSLOduration=1.738249532 podStartE2EDuration="1.738249532s" podCreationTimestamp="2025-11-25 09:20:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:20:56.721111066 +0000 UTC m=+989.923606204" watchObservedRunningTime="2025-11-25 09:20:56.738249532 +0000 UTC m=+989.940744670" Nov 25 09:20:57 crc kubenswrapper[4565]: I1125 09:20:57.726146 4565 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69494d9f89-7hsws" event={"ID":"46e9931e-fcd4-4c18-8c06-537d4c162c1f","Type":"ContainerStarted","Data":"9d7c7339722350082152669e174f85cfce99c05930a968e99cf081c699ea685c"} Nov 25 09:20:57 crc kubenswrapper[4565]: I1125 09:20:57.726647 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69494d9f89-7hsws" Nov 25 09:20:57 crc kubenswrapper[4565]: I1125 09:20:57.750793 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69494d9f89-7hsws" podStartSLOduration=3.750775513 podStartE2EDuration="3.750775513s" podCreationTimestamp="2025-11-25 09:20:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:20:57.746077116 +0000 UTC m=+990.948572255" watchObservedRunningTime="2025-11-25 09:20:57.750775513 +0000 UTC m=+990.953270651" Nov 25 09:20:58 crc kubenswrapper[4565]: I1125 09:20:58.370377 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 09:20:58 crc kubenswrapper[4565]: I1125 09:20:58.394170 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 09:21:00 crc kubenswrapper[4565]: I1125 09:21:00.777488 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"751b5095-237a-402c-b339-1bb87853d797","Type":"ContainerStarted","Data":"8ddda8e1b1c1cae4b0f5c498c90c3ae945b2191184b06d2a842f0f97c22af5e4"} Nov 25 09:21:00 crc kubenswrapper[4565]: I1125 09:21:00.778256 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"751b5095-237a-402c-b339-1bb87853d797","Type":"ContainerStarted","Data":"c128fc740dcb2d3cb3b38bed69e5f9914a30a200b00d2e9defb8b7896d27573e"} Nov 25 09:21:00 crc kubenswrapper[4565]: I1125 09:21:00.782095 4565 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-metadata-0" event={"ID":"533752d5-2780-4d49-add9-847fe3232f5d","Type":"ContainerStarted","Data":"447205ff7e7d33b66dd35f4252ff975ee5e50dca8b8d59996641ec387fe5b264"} Nov 25 09:21:00 crc kubenswrapper[4565]: I1125 09:21:00.782160 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"533752d5-2780-4d49-add9-847fe3232f5d","Type":"ContainerStarted","Data":"c2dc5f92bfab91356af6c3d02aa91fb6c46990aa05e29fee2d17e263180fe275"} Nov 25 09:21:00 crc kubenswrapper[4565]: I1125 09:21:00.782348 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="533752d5-2780-4d49-add9-847fe3232f5d" containerName="nova-metadata-log" containerID="cri-o://c2dc5f92bfab91356af6c3d02aa91fb6c46990aa05e29fee2d17e263180fe275" gracePeriod=30 Nov 25 09:21:00 crc kubenswrapper[4565]: I1125 09:21:00.782393 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="533752d5-2780-4d49-add9-847fe3232f5d" containerName="nova-metadata-metadata" containerID="cri-o://447205ff7e7d33b66dd35f4252ff975ee5e50dca8b8d59996641ec387fe5b264" gracePeriod=30 Nov 25 09:21:00 crc kubenswrapper[4565]: I1125 09:21:00.792304 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"06be536c-6234-4539-ae36-7bd60a6d2097","Type":"ContainerStarted","Data":"97859a3cab1813b3135332d25057b42b373928deb71f6f581ae5caac1b4a4784"} Nov 25 09:21:00 crc kubenswrapper[4565]: I1125 09:21:00.792459 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="06be536c-6234-4539-ae36-7bd60a6d2097" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://97859a3cab1813b3135332d25057b42b373928deb71f6f581ae5caac1b4a4784" gracePeriod=30 Nov 25 09:21:00 crc kubenswrapper[4565]: I1125 09:21:00.796063 4565 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-scheduler-0" event={"ID":"27dea053-ce0e-4727-a310-785ccfde4424","Type":"ContainerStarted","Data":"c42a05ca46ec599c0c8976cee755ac8751f863590afac854c5be59addf6ee659"} Nov 25 09:21:00 crc kubenswrapper[4565]: I1125 09:21:00.831272 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.7696827429999997 podStartE2EDuration="6.831236071s" podCreationTimestamp="2025-11-25 09:20:54 +0000 UTC" firstStartedPulling="2025-11-25 09:20:55.569838017 +0000 UTC m=+988.772333155" lastFinishedPulling="2025-11-25 09:20:59.631391346 +0000 UTC m=+992.833886483" observedRunningTime="2025-11-25 09:21:00.808739115 +0000 UTC m=+994.011234252" watchObservedRunningTime="2025-11-25 09:21:00.831236071 +0000 UTC m=+994.033731199" Nov 25 09:21:00 crc kubenswrapper[4565]: I1125 09:21:00.836527 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.183061598 podStartE2EDuration="6.836515372s" podCreationTimestamp="2025-11-25 09:20:54 +0000 UTC" firstStartedPulling="2025-11-25 09:20:55.966157065 +0000 UTC m=+989.168652203" lastFinishedPulling="2025-11-25 09:20:59.619610849 +0000 UTC m=+992.822105977" observedRunningTime="2025-11-25 09:21:00.828080022 +0000 UTC m=+994.030575161" watchObservedRunningTime="2025-11-25 09:21:00.836515372 +0000 UTC m=+994.039010510" Nov 25 09:21:00 crc kubenswrapper[4565]: I1125 09:21:00.851001 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.999383505 podStartE2EDuration="6.850980909s" podCreationTimestamp="2025-11-25 09:20:54 +0000 UTC" firstStartedPulling="2025-11-25 09:20:55.769100685 +0000 UTC m=+988.971595823" lastFinishedPulling="2025-11-25 09:20:59.620698088 +0000 UTC m=+992.823193227" observedRunningTime="2025-11-25 09:21:00.847773224 +0000 UTC m=+994.050268362" watchObservedRunningTime="2025-11-25 
09:21:00.850980909 +0000 UTC m=+994.053476048" Nov 25 09:21:00 crc kubenswrapper[4565]: I1125 09:21:00.866574 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.131913424 podStartE2EDuration="6.866561339s" podCreationTimestamp="2025-11-25 09:20:54 +0000 UTC" firstStartedPulling="2025-11-25 09:20:55.88565022 +0000 UTC m=+989.088145358" lastFinishedPulling="2025-11-25 09:20:59.620298134 +0000 UTC m=+992.822793273" observedRunningTime="2025-11-25 09:21:00.862287372 +0000 UTC m=+994.064782510" watchObservedRunningTime="2025-11-25 09:21:00.866561339 +0000 UTC m=+994.069056477" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.298097 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.362076 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/533752d5-2780-4d49-add9-847fe3232f5d-combined-ca-bundle\") pod \"533752d5-2780-4d49-add9-847fe3232f5d\" (UID: \"533752d5-2780-4d49-add9-847fe3232f5d\") " Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.362229 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/533752d5-2780-4d49-add9-847fe3232f5d-config-data\") pod \"533752d5-2780-4d49-add9-847fe3232f5d\" (UID: \"533752d5-2780-4d49-add9-847fe3232f5d\") " Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.362301 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpvlp\" (UniqueName: \"kubernetes.io/projected/533752d5-2780-4d49-add9-847fe3232f5d-kube-api-access-jpvlp\") pod \"533752d5-2780-4d49-add9-847fe3232f5d\" (UID: \"533752d5-2780-4d49-add9-847fe3232f5d\") " Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.362481 4565 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/533752d5-2780-4d49-add9-847fe3232f5d-logs\") pod \"533752d5-2780-4d49-add9-847fe3232f5d\" (UID: \"533752d5-2780-4d49-add9-847fe3232f5d\") " Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.363731 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/533752d5-2780-4d49-add9-847fe3232f5d-logs" (OuterVolumeSpecName: "logs") pod "533752d5-2780-4d49-add9-847fe3232f5d" (UID: "533752d5-2780-4d49-add9-847fe3232f5d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.368560 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/533752d5-2780-4d49-add9-847fe3232f5d-kube-api-access-jpvlp" (OuterVolumeSpecName: "kube-api-access-jpvlp") pod "533752d5-2780-4d49-add9-847fe3232f5d" (UID: "533752d5-2780-4d49-add9-847fe3232f5d"). InnerVolumeSpecName "kube-api-access-jpvlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.423087 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/533752d5-2780-4d49-add9-847fe3232f5d-config-data" (OuterVolumeSpecName: "config-data") pod "533752d5-2780-4d49-add9-847fe3232f5d" (UID: "533752d5-2780-4d49-add9-847fe3232f5d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.431014 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/533752d5-2780-4d49-add9-847fe3232f5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "533752d5-2780-4d49-add9-847fe3232f5d" (UID: "533752d5-2780-4d49-add9-847fe3232f5d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.466631 4565 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/533752d5-2780-4d49-add9-847fe3232f5d-logs\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.466663 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/533752d5-2780-4d49-add9-847fe3232f5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.466673 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/533752d5-2780-4d49-add9-847fe3232f5d-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.466682 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpvlp\" (UniqueName: \"kubernetes.io/projected/533752d5-2780-4d49-add9-847fe3232f5d-kube-api-access-jpvlp\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.805095 4565 generic.go:334] "Generic (PLEG): container finished" podID="cf895328-1ce8-477a-8939-2fe0442bfdb9" containerID="7ae64e00d915cf79cc448d39974b79b42a1886dc7a0b39e952f38854e3b715ac" exitCode=0 Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.805165 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xcthc" event={"ID":"cf895328-1ce8-477a-8939-2fe0442bfdb9","Type":"ContainerDied","Data":"7ae64e00d915cf79cc448d39974b79b42a1886dc7a0b39e952f38854e3b715ac"} Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.807003 4565 generic.go:334] "Generic (PLEG): container finished" podID="533752d5-2780-4d49-add9-847fe3232f5d" containerID="447205ff7e7d33b66dd35f4252ff975ee5e50dca8b8d59996641ec387fe5b264" exitCode=0 Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 
09:21:01.807024 4565 generic.go:334] "Generic (PLEG): container finished" podID="533752d5-2780-4d49-add9-847fe3232f5d" containerID="c2dc5f92bfab91356af6c3d02aa91fb6c46990aa05e29fee2d17e263180fe275" exitCode=143 Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.807059 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"533752d5-2780-4d49-add9-847fe3232f5d","Type":"ContainerDied","Data":"447205ff7e7d33b66dd35f4252ff975ee5e50dca8b8d59996641ec387fe5b264"} Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.807127 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"533752d5-2780-4d49-add9-847fe3232f5d","Type":"ContainerDied","Data":"c2dc5f92bfab91356af6c3d02aa91fb6c46990aa05e29fee2d17e263180fe275"} Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.807149 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"533752d5-2780-4d49-add9-847fe3232f5d","Type":"ContainerDied","Data":"c0c1389609e04c85588e2629a4835c9fa39e73a5fec6fa3d533fab359c0ce38e"} Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.807072 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.807168 4565 scope.go:117] "RemoveContainer" containerID="447205ff7e7d33b66dd35f4252ff975ee5e50dca8b8d59996641ec387fe5b264" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.832780 4565 scope.go:117] "RemoveContainer" containerID="c2dc5f92bfab91356af6c3d02aa91fb6c46990aa05e29fee2d17e263180fe275" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.847229 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.849761 4565 scope.go:117] "RemoveContainer" containerID="447205ff7e7d33b66dd35f4252ff975ee5e50dca8b8d59996641ec387fe5b264" Nov 25 09:21:01 crc kubenswrapper[4565]: E1125 09:21:01.850171 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"447205ff7e7d33b66dd35f4252ff975ee5e50dca8b8d59996641ec387fe5b264\": container with ID starting with 447205ff7e7d33b66dd35f4252ff975ee5e50dca8b8d59996641ec387fe5b264 not found: ID does not exist" containerID="447205ff7e7d33b66dd35f4252ff975ee5e50dca8b8d59996641ec387fe5b264" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.850226 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"447205ff7e7d33b66dd35f4252ff975ee5e50dca8b8d59996641ec387fe5b264"} err="failed to get container status \"447205ff7e7d33b66dd35f4252ff975ee5e50dca8b8d59996641ec387fe5b264\": rpc error: code = NotFound desc = could not find container \"447205ff7e7d33b66dd35f4252ff975ee5e50dca8b8d59996641ec387fe5b264\": container with ID starting with 447205ff7e7d33b66dd35f4252ff975ee5e50dca8b8d59996641ec387fe5b264 not found: ID does not exist" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.850260 4565 scope.go:117] "RemoveContainer" containerID="c2dc5f92bfab91356af6c3d02aa91fb6c46990aa05e29fee2d17e263180fe275" Nov 25 09:21:01 
crc kubenswrapper[4565]: E1125 09:21:01.850582 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2dc5f92bfab91356af6c3d02aa91fb6c46990aa05e29fee2d17e263180fe275\": container with ID starting with c2dc5f92bfab91356af6c3d02aa91fb6c46990aa05e29fee2d17e263180fe275 not found: ID does not exist" containerID="c2dc5f92bfab91356af6c3d02aa91fb6c46990aa05e29fee2d17e263180fe275" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.850623 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2dc5f92bfab91356af6c3d02aa91fb6c46990aa05e29fee2d17e263180fe275"} err="failed to get container status \"c2dc5f92bfab91356af6c3d02aa91fb6c46990aa05e29fee2d17e263180fe275\": rpc error: code = NotFound desc = could not find container \"c2dc5f92bfab91356af6c3d02aa91fb6c46990aa05e29fee2d17e263180fe275\": container with ID starting with c2dc5f92bfab91356af6c3d02aa91fb6c46990aa05e29fee2d17e263180fe275 not found: ID does not exist" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.850651 4565 scope.go:117] "RemoveContainer" containerID="447205ff7e7d33b66dd35f4252ff975ee5e50dca8b8d59996641ec387fe5b264" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.850883 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"447205ff7e7d33b66dd35f4252ff975ee5e50dca8b8d59996641ec387fe5b264"} err="failed to get container status \"447205ff7e7d33b66dd35f4252ff975ee5e50dca8b8d59996641ec387fe5b264\": rpc error: code = NotFound desc = could not find container \"447205ff7e7d33b66dd35f4252ff975ee5e50dca8b8d59996641ec387fe5b264\": container with ID starting with 447205ff7e7d33b66dd35f4252ff975ee5e50dca8b8d59996641ec387fe5b264 not found: ID does not exist" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.850913 4565 scope.go:117] "RemoveContainer" containerID="c2dc5f92bfab91356af6c3d02aa91fb6c46990aa05e29fee2d17e263180fe275" Nov 25 
09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.851161 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2dc5f92bfab91356af6c3d02aa91fb6c46990aa05e29fee2d17e263180fe275"} err="failed to get container status \"c2dc5f92bfab91356af6c3d02aa91fb6c46990aa05e29fee2d17e263180fe275\": rpc error: code = NotFound desc = could not find container \"c2dc5f92bfab91356af6c3d02aa91fb6c46990aa05e29fee2d17e263180fe275\": container with ID starting with c2dc5f92bfab91356af6c3d02aa91fb6c46990aa05e29fee2d17e263180fe275 not found: ID does not exist" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.855409 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.870405 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 25 09:21:01 crc kubenswrapper[4565]: E1125 09:21:01.870814 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="533752d5-2780-4d49-add9-847fe3232f5d" containerName="nova-metadata-log" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.870834 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="533752d5-2780-4d49-add9-847fe3232f5d" containerName="nova-metadata-log" Nov 25 09:21:01 crc kubenswrapper[4565]: E1125 09:21:01.870854 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="533752d5-2780-4d49-add9-847fe3232f5d" containerName="nova-metadata-metadata" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.870860 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="533752d5-2780-4d49-add9-847fe3232f5d" containerName="nova-metadata-metadata" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.871089 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="533752d5-2780-4d49-add9-847fe3232f5d" containerName="nova-metadata-log" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.871114 4565 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="533752d5-2780-4d49-add9-847fe3232f5d" containerName="nova-metadata-metadata" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.872095 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.874387 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.874496 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.928193 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.980334 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b71bc132-0cb8-4caf-994d-1f9723b50949-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b71bc132-0cb8-4caf-994d-1f9723b50949\") " pod="openstack/nova-metadata-0" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.980635 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b71bc132-0cb8-4caf-994d-1f9723b50949-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b71bc132-0cb8-4caf-994d-1f9723b50949\") " pod="openstack/nova-metadata-0" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.980693 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf9gt\" (UniqueName: \"kubernetes.io/projected/b71bc132-0cb8-4caf-994d-1f9723b50949-kube-api-access-vf9gt\") pod \"nova-metadata-0\" (UID: \"b71bc132-0cb8-4caf-994d-1f9723b50949\") " pod="openstack/nova-metadata-0" Nov 25 09:21:01 crc 
kubenswrapper[4565]: I1125 09:21:01.980756 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b71bc132-0cb8-4caf-994d-1f9723b50949-logs\") pod \"nova-metadata-0\" (UID: \"b71bc132-0cb8-4caf-994d-1f9723b50949\") " pod="openstack/nova-metadata-0" Nov 25 09:21:01 crc kubenswrapper[4565]: I1125 09:21:01.980792 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b71bc132-0cb8-4caf-994d-1f9723b50949-config-data\") pod \"nova-metadata-0\" (UID: \"b71bc132-0cb8-4caf-994d-1f9723b50949\") " pod="openstack/nova-metadata-0" Nov 25 09:21:02 crc kubenswrapper[4565]: I1125 09:21:02.083263 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b71bc132-0cb8-4caf-994d-1f9723b50949-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b71bc132-0cb8-4caf-994d-1f9723b50949\") " pod="openstack/nova-metadata-0" Nov 25 09:21:02 crc kubenswrapper[4565]: I1125 09:21:02.083565 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b71bc132-0cb8-4caf-994d-1f9723b50949-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b71bc132-0cb8-4caf-994d-1f9723b50949\") " pod="openstack/nova-metadata-0" Nov 25 09:21:02 crc kubenswrapper[4565]: I1125 09:21:02.083618 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf9gt\" (UniqueName: \"kubernetes.io/projected/b71bc132-0cb8-4caf-994d-1f9723b50949-kube-api-access-vf9gt\") pod \"nova-metadata-0\" (UID: \"b71bc132-0cb8-4caf-994d-1f9723b50949\") " pod="openstack/nova-metadata-0" Nov 25 09:21:02 crc kubenswrapper[4565]: I1125 09:21:02.083702 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/b71bc132-0cb8-4caf-994d-1f9723b50949-logs\") pod \"nova-metadata-0\" (UID: \"b71bc132-0cb8-4caf-994d-1f9723b50949\") " pod="openstack/nova-metadata-0" Nov 25 09:21:02 crc kubenswrapper[4565]: I1125 09:21:02.083738 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b71bc132-0cb8-4caf-994d-1f9723b50949-config-data\") pod \"nova-metadata-0\" (UID: \"b71bc132-0cb8-4caf-994d-1f9723b50949\") " pod="openstack/nova-metadata-0" Nov 25 09:21:02 crc kubenswrapper[4565]: I1125 09:21:02.084648 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b71bc132-0cb8-4caf-994d-1f9723b50949-logs\") pod \"nova-metadata-0\" (UID: \"b71bc132-0cb8-4caf-994d-1f9723b50949\") " pod="openstack/nova-metadata-0" Nov 25 09:21:02 crc kubenswrapper[4565]: I1125 09:21:02.088483 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b71bc132-0cb8-4caf-994d-1f9723b50949-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b71bc132-0cb8-4caf-994d-1f9723b50949\") " pod="openstack/nova-metadata-0" Nov 25 09:21:02 crc kubenswrapper[4565]: I1125 09:21:02.091099 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b71bc132-0cb8-4caf-994d-1f9723b50949-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b71bc132-0cb8-4caf-994d-1f9723b50949\") " pod="openstack/nova-metadata-0" Nov 25 09:21:02 crc kubenswrapper[4565]: I1125 09:21:02.092957 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b71bc132-0cb8-4caf-994d-1f9723b50949-config-data\") pod \"nova-metadata-0\" (UID: \"b71bc132-0cb8-4caf-994d-1f9723b50949\") " pod="openstack/nova-metadata-0" Nov 25 09:21:02 crc kubenswrapper[4565]: 
I1125 09:21:02.114456 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf9gt\" (UniqueName: \"kubernetes.io/projected/b71bc132-0cb8-4caf-994d-1f9723b50949-kube-api-access-vf9gt\") pod \"nova-metadata-0\" (UID: \"b71bc132-0cb8-4caf-994d-1f9723b50949\") " pod="openstack/nova-metadata-0" Nov 25 09:21:02 crc kubenswrapper[4565]: I1125 09:21:02.186670 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 09:21:02 crc kubenswrapper[4565]: I1125 09:21:02.665030 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 09:21:02 crc kubenswrapper[4565]: W1125 09:21:02.665611 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb71bc132_0cb8_4caf_994d_1f9723b50949.slice/crio-dfc083176d274e3e3dfbecc7a29649dc8023e071421214f24a20f03ba29fbb66 WatchSource:0}: Error finding container dfc083176d274e3e3dfbecc7a29649dc8023e071421214f24a20f03ba29fbb66: Status 404 returned error can't find the container with id dfc083176d274e3e3dfbecc7a29649dc8023e071421214f24a20f03ba29fbb66 Nov 25 09:21:02 crc kubenswrapper[4565]: I1125 09:21:02.818910 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b71bc132-0cb8-4caf-994d-1f9723b50949","Type":"ContainerStarted","Data":"dfc083176d274e3e3dfbecc7a29649dc8023e071421214f24a20f03ba29fbb66"} Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.090440 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xcthc" Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.110239 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf895328-1ce8-477a-8939-2fe0442bfdb9-config-data\") pod \"cf895328-1ce8-477a-8939-2fe0442bfdb9\" (UID: \"cf895328-1ce8-477a-8939-2fe0442bfdb9\") " Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.110646 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf895328-1ce8-477a-8939-2fe0442bfdb9-combined-ca-bundle\") pod \"cf895328-1ce8-477a-8939-2fe0442bfdb9\" (UID: \"cf895328-1ce8-477a-8939-2fe0442bfdb9\") " Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.110869 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7kkt\" (UniqueName: \"kubernetes.io/projected/cf895328-1ce8-477a-8939-2fe0442bfdb9-kube-api-access-f7kkt\") pod \"cf895328-1ce8-477a-8939-2fe0442bfdb9\" (UID: \"cf895328-1ce8-477a-8939-2fe0442bfdb9\") " Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.111004 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf895328-1ce8-477a-8939-2fe0442bfdb9-scripts\") pod \"cf895328-1ce8-477a-8939-2fe0442bfdb9\" (UID: \"cf895328-1ce8-477a-8939-2fe0442bfdb9\") " Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.126154 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf895328-1ce8-477a-8939-2fe0442bfdb9-kube-api-access-f7kkt" (OuterVolumeSpecName: "kube-api-access-f7kkt") pod "cf895328-1ce8-477a-8939-2fe0442bfdb9" (UID: "cf895328-1ce8-477a-8939-2fe0442bfdb9"). InnerVolumeSpecName "kube-api-access-f7kkt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.134374 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="533752d5-2780-4d49-add9-847fe3232f5d" path="/var/lib/kubelet/pods/533752d5-2780-4d49-add9-847fe3232f5d/volumes" Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.142677 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf895328-1ce8-477a-8939-2fe0442bfdb9-config-data" (OuterVolumeSpecName: "config-data") pod "cf895328-1ce8-477a-8939-2fe0442bfdb9" (UID: "cf895328-1ce8-477a-8939-2fe0442bfdb9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.155053 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf895328-1ce8-477a-8939-2fe0442bfdb9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf895328-1ce8-477a-8939-2fe0442bfdb9" (UID: "cf895328-1ce8-477a-8939-2fe0442bfdb9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.155074 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf895328-1ce8-477a-8939-2fe0442bfdb9-scripts" (OuterVolumeSpecName: "scripts") pod "cf895328-1ce8-477a-8939-2fe0442bfdb9" (UID: "cf895328-1ce8-477a-8939-2fe0442bfdb9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.214506 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf895328-1ce8-477a-8939-2fe0442bfdb9-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.214551 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf895328-1ce8-477a-8939-2fe0442bfdb9-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.214563 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf895328-1ce8-477a-8939-2fe0442bfdb9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.214576 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7kkt\" (UniqueName: \"kubernetes.io/projected/cf895328-1ce8-477a-8939-2fe0442bfdb9-kube-api-access-f7kkt\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.833406 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xcthc" Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.833393 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xcthc" event={"ID":"cf895328-1ce8-477a-8939-2fe0442bfdb9","Type":"ContainerDied","Data":"e89e18f407c9538413ac8bdbff57a9dbb723df076700f1cde549cc9cda98b653"} Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.833885 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e89e18f407c9538413ac8bdbff57a9dbb723df076700f1cde549cc9cda98b653" Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.835601 4565 generic.go:334] "Generic (PLEG): container finished" podID="1097aa33-9ec1-4839-b8ea-faa176627408" containerID="96bd0e1c1f9ef62f5704df271e4e6b63d0f6f070498795bface69ad77120e221" exitCode=0 Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.835702 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x72jl" event={"ID":"1097aa33-9ec1-4839-b8ea-faa176627408","Type":"ContainerDied","Data":"96bd0e1c1f9ef62f5704df271e4e6b63d0f6f070498795bface69ad77120e221"} Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.838377 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b71bc132-0cb8-4caf-994d-1f9723b50949","Type":"ContainerStarted","Data":"34d606846ecf675547b9a68acf35a0c65eda9fbf8ead7f3c60f2264bacbcbd64"} Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.838411 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b71bc132-0cb8-4caf-994d-1f9723b50949","Type":"ContainerStarted","Data":"ade1a7ca6e7c2f7585a4a22e281bd93c782df92c346701ba5c34e6374a1faa9e"} Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.904258 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.9042383000000003 
podStartE2EDuration="2.9042383s" podCreationTimestamp="2025-11-25 09:21:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:21:03.889224379 +0000 UTC m=+997.091719517" watchObservedRunningTime="2025-11-25 09:21:03.9042383 +0000 UTC m=+997.106733438" Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.906232 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 25 09:21:03 crc kubenswrapper[4565]: E1125 09:21:03.906696 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf895328-1ce8-477a-8939-2fe0442bfdb9" containerName="nova-cell1-conductor-db-sync" Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.906720 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf895328-1ce8-477a-8939-2fe0442bfdb9" containerName="nova-cell1-conductor-db-sync" Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.906910 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf895328-1ce8-477a-8939-2fe0442bfdb9" containerName="nova-cell1-conductor-db-sync" Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.914621 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.914711 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.928700 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26443d78-c2f9-4e62-9f77-69dbca9848f0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"26443d78-c2f9-4e62-9f77-69dbca9848f0\") " pod="openstack/nova-cell1-conductor-0" Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.928794 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjcj6\" (UniqueName: \"kubernetes.io/projected/26443d78-c2f9-4e62-9f77-69dbca9848f0-kube-api-access-kjcj6\") pod \"nova-cell1-conductor-0\" (UID: \"26443d78-c2f9-4e62-9f77-69dbca9848f0\") " pod="openstack/nova-cell1-conductor-0" Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.928849 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26443d78-c2f9-4e62-9f77-69dbca9848f0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"26443d78-c2f9-4e62-9f77-69dbca9848f0\") " pod="openstack/nova-cell1-conductor-0" Nov 25 09:21:03 crc kubenswrapper[4565]: I1125 09:21:03.929391 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 25 09:21:04 crc kubenswrapper[4565]: I1125 09:21:04.030351 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjcj6\" (UniqueName: \"kubernetes.io/projected/26443d78-c2f9-4e62-9f77-69dbca9848f0-kube-api-access-kjcj6\") pod \"nova-cell1-conductor-0\" (UID: \"26443d78-c2f9-4e62-9f77-69dbca9848f0\") " pod="openstack/nova-cell1-conductor-0" Nov 25 09:21:04 crc kubenswrapper[4565]: I1125 09:21:04.030457 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/26443d78-c2f9-4e62-9f77-69dbca9848f0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"26443d78-c2f9-4e62-9f77-69dbca9848f0\") " pod="openstack/nova-cell1-conductor-0" Nov 25 09:21:04 crc kubenswrapper[4565]: I1125 09:21:04.030672 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26443d78-c2f9-4e62-9f77-69dbca9848f0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"26443d78-c2f9-4e62-9f77-69dbca9848f0\") " pod="openstack/nova-cell1-conductor-0" Nov 25 09:21:04 crc kubenswrapper[4565]: I1125 09:21:04.040394 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26443d78-c2f9-4e62-9f77-69dbca9848f0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"26443d78-c2f9-4e62-9f77-69dbca9848f0\") " pod="openstack/nova-cell1-conductor-0" Nov 25 09:21:04 crc kubenswrapper[4565]: I1125 09:21:04.043555 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26443d78-c2f9-4e62-9f77-69dbca9848f0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"26443d78-c2f9-4e62-9f77-69dbca9848f0\") " pod="openstack/nova-cell1-conductor-0" Nov 25 09:21:04 crc kubenswrapper[4565]: I1125 09:21:04.043955 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjcj6\" (UniqueName: \"kubernetes.io/projected/26443d78-c2f9-4e62-9f77-69dbca9848f0-kube-api-access-kjcj6\") pod \"nova-cell1-conductor-0\" (UID: \"26443d78-c2f9-4e62-9f77-69dbca9848f0\") " pod="openstack/nova-cell1-conductor-0" Nov 25 09:21:04 crc kubenswrapper[4565]: I1125 09:21:04.236239 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 25 09:21:04 crc kubenswrapper[4565]: I1125 09:21:04.664378 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 25 09:21:04 crc kubenswrapper[4565]: W1125 09:21:04.666264 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26443d78_c2f9_4e62_9f77_69dbca9848f0.slice/crio-ce767e6607f7c15bd0aa886e0e9f22689a9de32c809c219412e0a281b12736d9 WatchSource:0}: Error finding container ce767e6607f7c15bd0aa886e0e9f22689a9de32c809c219412e0a281b12736d9: Status 404 returned error can't find the container with id ce767e6607f7c15bd0aa886e0e9f22689a9de32c809c219412e0a281b12736d9 Nov 25 09:21:04 crc kubenswrapper[4565]: I1125 09:21:04.852769 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"26443d78-c2f9-4e62-9f77-69dbca9848f0","Type":"ContainerStarted","Data":"daca2d21deb15a0555ece0215533dffdb517abdcca64c4a2479a80ba083e58d8"} Nov 25 09:21:04 crc kubenswrapper[4565]: I1125 09:21:04.852983 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Nov 25 09:21:04 crc kubenswrapper[4565]: I1125 09:21:04.853031 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"26443d78-c2f9-4e62-9f77-69dbca9848f0","Type":"ContainerStarted","Data":"ce767e6607f7c15bd0aa886e0e9f22689a9de32c809c219412e0a281b12736d9"} Nov 25 09:21:04 crc kubenswrapper[4565]: I1125 09:21:04.870235 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.870217932 podStartE2EDuration="1.870217932s" podCreationTimestamp="2025-11-25 09:21:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 
09:21:04.869508114 +0000 UTC m=+998.072003253" watchObservedRunningTime="2025-11-25 09:21:04.870217932 +0000 UTC m=+998.072713071" Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.073589 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.073978 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.088517 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.188714 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x72jl" Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.221249 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.223181 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.250082 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.258732 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1097aa33-9ec1-4839-b8ea-faa176627408-combined-ca-bundle\") pod \"1097aa33-9ec1-4839-b8ea-faa176627408\" (UID: \"1097aa33-9ec1-4839-b8ea-faa176627408\") " Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.296216 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1097aa33-9ec1-4839-b8ea-faa176627408-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1097aa33-9ec1-4839-b8ea-faa176627408" 
(UID: "1097aa33-9ec1-4839-b8ea-faa176627408"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.361325 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1097aa33-9ec1-4839-b8ea-faa176627408-scripts\") pod \"1097aa33-9ec1-4839-b8ea-faa176627408\" (UID: \"1097aa33-9ec1-4839-b8ea-faa176627408\") " Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.361423 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1097aa33-9ec1-4839-b8ea-faa176627408-config-data\") pod \"1097aa33-9ec1-4839-b8ea-faa176627408\" (UID: \"1097aa33-9ec1-4839-b8ea-faa176627408\") " Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.361461 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsjq2\" (UniqueName: \"kubernetes.io/projected/1097aa33-9ec1-4839-b8ea-faa176627408-kube-api-access-fsjq2\") pod \"1097aa33-9ec1-4839-b8ea-faa176627408\" (UID: \"1097aa33-9ec1-4839-b8ea-faa176627408\") " Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.361767 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1097aa33-9ec1-4839-b8ea-faa176627408-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.364988 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1097aa33-9ec1-4839-b8ea-faa176627408-scripts" (OuterVolumeSpecName: "scripts") pod "1097aa33-9ec1-4839-b8ea-faa176627408" (UID: "1097aa33-9ec1-4839-b8ea-faa176627408"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.365557 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1097aa33-9ec1-4839-b8ea-faa176627408-kube-api-access-fsjq2" (OuterVolumeSpecName: "kube-api-access-fsjq2") pod "1097aa33-9ec1-4839-b8ea-faa176627408" (UID: "1097aa33-9ec1-4839-b8ea-faa176627408"). InnerVolumeSpecName "kube-api-access-fsjq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.387392 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69494d9f89-7hsws" Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.387805 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1097aa33-9ec1-4839-b8ea-faa176627408-config-data" (OuterVolumeSpecName: "config-data") pod "1097aa33-9ec1-4839-b8ea-faa176627408" (UID: "1097aa33-9ec1-4839-b8ea-faa176627408"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.437826 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-775457b975-zp8vm"]
Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.438103 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-775457b975-zp8vm" podUID="5152cfbe-d229-461a-b9f0-07370920821b" containerName="dnsmasq-dns" containerID="cri-o://b6e65b11377ed310c9c5c120c2ea0020e10a78cd66e546f1cf3e393fa6cf51dc" gracePeriod=10
Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.468737 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsjq2\" (UniqueName: \"kubernetes.io/projected/1097aa33-9ec1-4839-b8ea-faa176627408-kube-api-access-fsjq2\") on node \"crc\" DevicePath \"\""
Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.468770 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1097aa33-9ec1-4839-b8ea-faa176627408-scripts\") on node \"crc\" DevicePath \"\""
Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.468781 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1097aa33-9ec1-4839-b8ea-faa176627408-config-data\") on node \"crc\" DevicePath \"\""
Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.878787 4565 generic.go:334] "Generic (PLEG): container finished" podID="5152cfbe-d229-461a-b9f0-07370920821b" containerID="b6e65b11377ed310c9c5c120c2ea0020e10a78cd66e546f1cf3e393fa6cf51dc" exitCode=0
Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.878880 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775457b975-zp8vm" event={"ID":"5152cfbe-d229-461a-b9f0-07370920821b","Type":"ContainerDied","Data":"b6e65b11377ed310c9c5c120c2ea0020e10a78cd66e546f1cf3e393fa6cf51dc"}
Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.891193 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x72jl" event={"ID":"1097aa33-9ec1-4839-b8ea-faa176627408","Type":"ContainerDied","Data":"38f2d4ca6d9b81de82d2b58446d36e4e07834e3c09e1d09878f0405f53e0a25c"}
Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.891230 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38f2d4ca6d9b81de82d2b58446d36e4e07834e3c09e1d09878f0405f53e0a25c"
Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.891445 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x72jl"
Nov 25 09:21:05 crc kubenswrapper[4565]: I1125 09:21:05.968057 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.015666 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-775457b975-zp8vm"
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.020385 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.020529 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="751b5095-237a-402c-b339-1bb87853d797" containerName="nova-api-log" containerID="cri-o://c128fc740dcb2d3cb3b38bed69e5f9914a30a200b00d2e9defb8b7896d27573e" gracePeriod=30
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.020602 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="751b5095-237a-402c-b339-1bb87853d797" containerName="nova-api-api" containerID="cri-o://8ddda8e1b1c1cae4b0f5c498c90c3ae945b2191184b06d2a842f0f97c22af5e4" gracePeriod=30
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.024327 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.030120 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="751b5095-237a-402c-b339-1bb87853d797" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.169:8774/\": EOF"
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.037036 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.037201 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b71bc132-0cb8-4caf-994d-1f9723b50949" containerName="nova-metadata-log" containerID="cri-o://ade1a7ca6e7c2f7585a4a22e281bd93c782df92c346701ba5c34e6374a1faa9e" gracePeriod=30
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.037307 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b71bc132-0cb8-4caf-994d-1f9723b50949" containerName="nova-metadata-metadata" containerID="cri-o://34d606846ecf675547b9a68acf35a0c65eda9fbf8ead7f3c60f2264bacbcbd64" gracePeriod=30
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.057131 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="751b5095-237a-402c-b339-1bb87853d797" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.169:8774/\": EOF"
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.087885 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twwxh\" (UniqueName: \"kubernetes.io/projected/5152cfbe-d229-461a-b9f0-07370920821b-kube-api-access-twwxh\") pod \"5152cfbe-d229-461a-b9f0-07370920821b\" (UID: \"5152cfbe-d229-461a-b9f0-07370920821b\") "
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.087968 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5152cfbe-d229-461a-b9f0-07370920821b-config\") pod \"5152cfbe-d229-461a-b9f0-07370920821b\" (UID: \"5152cfbe-d229-461a-b9f0-07370920821b\") "
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.088020 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5152cfbe-d229-461a-b9f0-07370920821b-dns-svc\") pod \"5152cfbe-d229-461a-b9f0-07370920821b\" (UID: \"5152cfbe-d229-461a-b9f0-07370920821b\") "
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.088088 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5152cfbe-d229-461a-b9f0-07370920821b-ovsdbserver-sb\") pod \"5152cfbe-d229-461a-b9f0-07370920821b\" (UID: \"5152cfbe-d229-461a-b9f0-07370920821b\") "
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.088216 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5152cfbe-d229-461a-b9f0-07370920821b-ovsdbserver-nb\") pod \"5152cfbe-d229-461a-b9f0-07370920821b\" (UID: \"5152cfbe-d229-461a-b9f0-07370920821b\") "
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.106123 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5152cfbe-d229-461a-b9f0-07370920821b-kube-api-access-twwxh" (OuterVolumeSpecName: "kube-api-access-twwxh") pod "5152cfbe-d229-461a-b9f0-07370920821b" (UID: "5152cfbe-d229-461a-b9f0-07370920821b"). InnerVolumeSpecName "kube-api-access-twwxh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.181655 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5152cfbe-d229-461a-b9f0-07370920821b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5152cfbe-d229-461a-b9f0-07370920821b" (UID: "5152cfbe-d229-461a-b9f0-07370920821b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.182115 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5152cfbe-d229-461a-b9f0-07370920821b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5152cfbe-d229-461a-b9f0-07370920821b" (UID: "5152cfbe-d229-461a-b9f0-07370920821b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.183855 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5152cfbe-d229-461a-b9f0-07370920821b-config" (OuterVolumeSpecName: "config") pod "5152cfbe-d229-461a-b9f0-07370920821b" (UID: "5152cfbe-d229-461a-b9f0-07370920821b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.191034 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twwxh\" (UniqueName: \"kubernetes.io/projected/5152cfbe-d229-461a-b9f0-07370920821b-kube-api-access-twwxh\") on node \"crc\" DevicePath \"\""
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.191059 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5152cfbe-d229-461a-b9f0-07370920821b-config\") on node \"crc\" DevicePath \"\""
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.191070 4565 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5152cfbe-d229-461a-b9f0-07370920821b-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.191081 4565 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5152cfbe-d229-461a-b9f0-07370920821b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.198269 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5152cfbe-d229-461a-b9f0-07370920821b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5152cfbe-d229-461a-b9f0-07370920821b" (UID: "5152cfbe-d229-461a-b9f0-07370920821b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.292635 4565 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5152cfbe-d229-461a-b9f0-07370920821b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.872851 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.901991 4565 generic.go:334] "Generic (PLEG): container finished" podID="751b5095-237a-402c-b339-1bb87853d797" containerID="c128fc740dcb2d3cb3b38bed69e5f9914a30a200b00d2e9defb8b7896d27573e" exitCode=143
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.902298 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"751b5095-237a-402c-b339-1bb87853d797","Type":"ContainerDied","Data":"c128fc740dcb2d3cb3b38bed69e5f9914a30a200b00d2e9defb8b7896d27573e"}
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.903903 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf9gt\" (UniqueName: \"kubernetes.io/projected/b71bc132-0cb8-4caf-994d-1f9723b50949-kube-api-access-vf9gt\") pod \"b71bc132-0cb8-4caf-994d-1f9723b50949\" (UID: \"b71bc132-0cb8-4caf-994d-1f9723b50949\") "
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.903984 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b71bc132-0cb8-4caf-994d-1f9723b50949-combined-ca-bundle\") pod \"b71bc132-0cb8-4caf-994d-1f9723b50949\" (UID: \"b71bc132-0cb8-4caf-994d-1f9723b50949\") "
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.904151 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775457b975-zp8vm" event={"ID":"5152cfbe-d229-461a-b9f0-07370920821b","Type":"ContainerDied","Data":"2a4e509f1b12160133709ffeb928c9bff3db23924301f69fb6cbe9b5edd68623"}
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.904270 4565 scope.go:117] "RemoveContainer" containerID="b6e65b11377ed310c9c5c120c2ea0020e10a78cd66e546f1cf3e393fa6cf51dc"
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.904496 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-775457b975-zp8vm"
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.904215 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b71bc132-0cb8-4caf-994d-1f9723b50949-nova-metadata-tls-certs\") pod \"b71bc132-0cb8-4caf-994d-1f9723b50949\" (UID: \"b71bc132-0cb8-4caf-994d-1f9723b50949\") "
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.912116 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b71bc132-0cb8-4caf-994d-1f9723b50949-config-data\") pod \"b71bc132-0cb8-4caf-994d-1f9723b50949\" (UID: \"b71bc132-0cb8-4caf-994d-1f9723b50949\") "
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.912349 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b71bc132-0cb8-4caf-994d-1f9723b50949-logs\") pod \"b71bc132-0cb8-4caf-994d-1f9723b50949\" (UID: \"b71bc132-0cb8-4caf-994d-1f9723b50949\") "
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.913571 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b71bc132-0cb8-4caf-994d-1f9723b50949-logs" (OuterVolumeSpecName: "logs") pod "b71bc132-0cb8-4caf-994d-1f9723b50949" (UID: "b71bc132-0cb8-4caf-994d-1f9723b50949"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.922193 4565 generic.go:334] "Generic (PLEG): container finished" podID="b71bc132-0cb8-4caf-994d-1f9723b50949" containerID="34d606846ecf675547b9a68acf35a0c65eda9fbf8ead7f3c60f2264bacbcbd64" exitCode=0
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.922312 4565 generic.go:334] "Generic (PLEG): container finished" podID="b71bc132-0cb8-4caf-994d-1f9723b50949" containerID="ade1a7ca6e7c2f7585a4a22e281bd93c782df92c346701ba5c34e6374a1faa9e" exitCode=143
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.923279 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.924009 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b71bc132-0cb8-4caf-994d-1f9723b50949","Type":"ContainerDied","Data":"34d606846ecf675547b9a68acf35a0c65eda9fbf8ead7f3c60f2264bacbcbd64"}
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.924125 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b71bc132-0cb8-4caf-994d-1f9723b50949","Type":"ContainerDied","Data":"ade1a7ca6e7c2f7585a4a22e281bd93c782df92c346701ba5c34e6374a1faa9e"}
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.924210 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b71bc132-0cb8-4caf-994d-1f9723b50949","Type":"ContainerDied","Data":"dfc083176d274e3e3dfbecc7a29649dc8023e071421214f24a20f03ba29fbb66"}
Nov 25 09:21:06 crc kubenswrapper[4565]: I1125 09:21:06.944595 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b71bc132-0cb8-4caf-994d-1f9723b50949-kube-api-access-vf9gt" (OuterVolumeSpecName: "kube-api-access-vf9gt") pod "b71bc132-0cb8-4caf-994d-1f9723b50949" (UID: "b71bc132-0cb8-4caf-994d-1f9723b50949"). InnerVolumeSpecName "kube-api-access-vf9gt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.010565 4565 scope.go:117] "RemoveContainer" containerID="705315a3da171a0197161b151d60aa9f71ed01f35ad377449880462353da093e"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.010824 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b71bc132-0cb8-4caf-994d-1f9723b50949-config-data" (OuterVolumeSpecName: "config-data") pod "b71bc132-0cb8-4caf-994d-1f9723b50949" (UID: "b71bc132-0cb8-4caf-994d-1f9723b50949"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.017665 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b71bc132-0cb8-4caf-994d-1f9723b50949-config-data\") on node \"crc\" DevicePath \"\""
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.019620 4565 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b71bc132-0cb8-4caf-994d-1f9723b50949-logs\") on node \"crc\" DevicePath \"\""
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.019702 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf9gt\" (UniqueName: \"kubernetes.io/projected/b71bc132-0cb8-4caf-994d-1f9723b50949-kube-api-access-vf9gt\") on node \"crc\" DevicePath \"\""
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.017838 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b71bc132-0cb8-4caf-994d-1f9723b50949-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b71bc132-0cb8-4caf-994d-1f9723b50949" (UID: "b71bc132-0cb8-4caf-994d-1f9723b50949"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.040539 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-775457b975-zp8vm"]
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.042102 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b71bc132-0cb8-4caf-994d-1f9723b50949-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b71bc132-0cb8-4caf-994d-1f9723b50949" (UID: "b71bc132-0cb8-4caf-994d-1f9723b50949"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.079176 4565 scope.go:117] "RemoveContainer" containerID="34d606846ecf675547b9a68acf35a0c65eda9fbf8ead7f3c60f2264bacbcbd64"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.089675 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-775457b975-zp8vm"]
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.123186 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b71bc132-0cb8-4caf-994d-1f9723b50949-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.123325 4565 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b71bc132-0cb8-4caf-994d-1f9723b50949-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.152456 4565 scope.go:117] "RemoveContainer" containerID="ade1a7ca6e7c2f7585a4a22e281bd93c782df92c346701ba5c34e6374a1faa9e"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.168512 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5152cfbe-d229-461a-b9f0-07370920821b" path="/var/lib/kubelet/pods/5152cfbe-d229-461a-b9f0-07370920821b/volumes"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.169791 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.212160 4565 scope.go:117] "RemoveContainer" containerID="34d606846ecf675547b9a68acf35a0c65eda9fbf8ead7f3c60f2264bacbcbd64"
Nov 25 09:21:07 crc kubenswrapper[4565]: E1125 09:21:07.215288 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34d606846ecf675547b9a68acf35a0c65eda9fbf8ead7f3c60f2264bacbcbd64\": container with ID starting with 34d606846ecf675547b9a68acf35a0c65eda9fbf8ead7f3c60f2264bacbcbd64 not found: ID does not exist" containerID="34d606846ecf675547b9a68acf35a0c65eda9fbf8ead7f3c60f2264bacbcbd64"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.215509 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34d606846ecf675547b9a68acf35a0c65eda9fbf8ead7f3c60f2264bacbcbd64"} err="failed to get container status \"34d606846ecf675547b9a68acf35a0c65eda9fbf8ead7f3c60f2264bacbcbd64\": rpc error: code = NotFound desc = could not find container \"34d606846ecf675547b9a68acf35a0c65eda9fbf8ead7f3c60f2264bacbcbd64\": container with ID starting with 34d606846ecf675547b9a68acf35a0c65eda9fbf8ead7f3c60f2264bacbcbd64 not found: ID does not exist"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.215590 4565 scope.go:117] "RemoveContainer" containerID="ade1a7ca6e7c2f7585a4a22e281bd93c782df92c346701ba5c34e6374a1faa9e"
Nov 25 09:21:07 crc kubenswrapper[4565]: E1125 09:21:07.245496 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ade1a7ca6e7c2f7585a4a22e281bd93c782df92c346701ba5c34e6374a1faa9e\": container with ID starting with ade1a7ca6e7c2f7585a4a22e281bd93c782df92c346701ba5c34e6374a1faa9e not found: ID does not exist" containerID="ade1a7ca6e7c2f7585a4a22e281bd93c782df92c346701ba5c34e6374a1faa9e"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.248032 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ade1a7ca6e7c2f7585a4a22e281bd93c782df92c346701ba5c34e6374a1faa9e"} err="failed to get container status \"ade1a7ca6e7c2f7585a4a22e281bd93c782df92c346701ba5c34e6374a1faa9e\": rpc error: code = NotFound desc = could not find container \"ade1a7ca6e7c2f7585a4a22e281bd93c782df92c346701ba5c34e6374a1faa9e\": container with ID starting with ade1a7ca6e7c2f7585a4a22e281bd93c782df92c346701ba5c34e6374a1faa9e not found: ID does not exist"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.248063 4565 scope.go:117] "RemoveContainer" containerID="34d606846ecf675547b9a68acf35a0c65eda9fbf8ead7f3c60f2264bacbcbd64"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.253082 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34d606846ecf675547b9a68acf35a0c65eda9fbf8ead7f3c60f2264bacbcbd64"} err="failed to get container status \"34d606846ecf675547b9a68acf35a0c65eda9fbf8ead7f3c60f2264bacbcbd64\": rpc error: code = NotFound desc = could not find container \"34d606846ecf675547b9a68acf35a0c65eda9fbf8ead7f3c60f2264bacbcbd64\": container with ID starting with 34d606846ecf675547b9a68acf35a0c65eda9fbf8ead7f3c60f2264bacbcbd64 not found: ID does not exist"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.253124 4565 scope.go:117] "RemoveContainer" containerID="ade1a7ca6e7c2f7585a4a22e281bd93c782df92c346701ba5c34e6374a1faa9e"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.254273 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ade1a7ca6e7c2f7585a4a22e281bd93c782df92c346701ba5c34e6374a1faa9e"} err="failed to get container status \"ade1a7ca6e7c2f7585a4a22e281bd93c782df92c346701ba5c34e6374a1faa9e\": rpc error: code = NotFound desc = could not find container \"ade1a7ca6e7c2f7585a4a22e281bd93c782df92c346701ba5c34e6374a1faa9e\": container with ID starting with ade1a7ca6e7c2f7585a4a22e281bd93c782df92c346701ba5c34e6374a1faa9e not found: ID does not exist"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.321211 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.329941 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.357992 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Nov 25 09:21:07 crc kubenswrapper[4565]: E1125 09:21:07.358387 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b71bc132-0cb8-4caf-994d-1f9723b50949" containerName="nova-metadata-log"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.359796 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b71bc132-0cb8-4caf-994d-1f9723b50949" containerName="nova-metadata-log"
Nov 25 09:21:07 crc kubenswrapper[4565]: E1125 09:21:07.359818 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5152cfbe-d229-461a-b9f0-07370920821b" containerName="init"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.359826 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="5152cfbe-d229-461a-b9f0-07370920821b" containerName="init"
Nov 25 09:21:07 crc kubenswrapper[4565]: E1125 09:21:07.359839 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b71bc132-0cb8-4caf-994d-1f9723b50949" containerName="nova-metadata-metadata"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.359847 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b71bc132-0cb8-4caf-994d-1f9723b50949" containerName="nova-metadata-metadata"
Nov 25 09:21:07 crc kubenswrapper[4565]: E1125 09:21:07.359860 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1097aa33-9ec1-4839-b8ea-faa176627408" containerName="nova-manage"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.359865 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="1097aa33-9ec1-4839-b8ea-faa176627408" containerName="nova-manage"
Nov 25 09:21:07 crc kubenswrapper[4565]: E1125 09:21:07.359883 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5152cfbe-d229-461a-b9f0-07370920821b" containerName="dnsmasq-dns"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.359890 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="5152cfbe-d229-461a-b9f0-07370920821b" containerName="dnsmasq-dns"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.360088 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="b71bc132-0cb8-4caf-994d-1f9723b50949" containerName="nova-metadata-metadata"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.360106 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="b71bc132-0cb8-4caf-994d-1f9723b50949" containerName="nova-metadata-log"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.360116 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="1097aa33-9ec1-4839-b8ea-faa176627408" containerName="nova-manage"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.360123 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="5152cfbe-d229-461a-b9f0-07370920821b" containerName="dnsmasq-dns"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.362131 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.366449 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.373312 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.378133 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.537243 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5357da13-b37d-4661-8e17-45ddaf365687-logs\") pod \"nova-metadata-0\" (UID: \"5357da13-b37d-4661-8e17-45ddaf365687\") " pod="openstack/nova-metadata-0"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.537532 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn2s5\" (UniqueName: \"kubernetes.io/projected/5357da13-b37d-4661-8e17-45ddaf365687-kube-api-access-gn2s5\") pod \"nova-metadata-0\" (UID: \"5357da13-b37d-4661-8e17-45ddaf365687\") " pod="openstack/nova-metadata-0"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.537638 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5357da13-b37d-4661-8e17-45ddaf365687-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5357da13-b37d-4661-8e17-45ddaf365687\") " pod="openstack/nova-metadata-0"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.537739 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5357da13-b37d-4661-8e17-45ddaf365687-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5357da13-b37d-4661-8e17-45ddaf365687\") " pod="openstack/nova-metadata-0"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.537857 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5357da13-b37d-4661-8e17-45ddaf365687-config-data\") pod \"nova-metadata-0\" (UID: \"5357da13-b37d-4661-8e17-45ddaf365687\") " pod="openstack/nova-metadata-0"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.641104 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5357da13-b37d-4661-8e17-45ddaf365687-logs\") pod \"nova-metadata-0\" (UID: \"5357da13-b37d-4661-8e17-45ddaf365687\") " pod="openstack/nova-metadata-0"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.641371 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn2s5\" (UniqueName: \"kubernetes.io/projected/5357da13-b37d-4661-8e17-45ddaf365687-kube-api-access-gn2s5\") pod \"nova-metadata-0\" (UID: \"5357da13-b37d-4661-8e17-45ddaf365687\") " pod="openstack/nova-metadata-0"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.641467 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5357da13-b37d-4661-8e17-45ddaf365687-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5357da13-b37d-4661-8e17-45ddaf365687\") " pod="openstack/nova-metadata-0"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.641550 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5357da13-b37d-4661-8e17-45ddaf365687-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5357da13-b37d-4661-8e17-45ddaf365687\") " pod="openstack/nova-metadata-0"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.641610 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5357da13-b37d-4661-8e17-45ddaf365687-config-data\") pod \"nova-metadata-0\" (UID: \"5357da13-b37d-4661-8e17-45ddaf365687\") " pod="openstack/nova-metadata-0"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.642013 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5357da13-b37d-4661-8e17-45ddaf365687-logs\") pod \"nova-metadata-0\" (UID: \"5357da13-b37d-4661-8e17-45ddaf365687\") " pod="openstack/nova-metadata-0"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.647051 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5357da13-b37d-4661-8e17-45ddaf365687-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5357da13-b37d-4661-8e17-45ddaf365687\") " pod="openstack/nova-metadata-0"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.647734 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5357da13-b37d-4661-8e17-45ddaf365687-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5357da13-b37d-4661-8e17-45ddaf365687\") " pod="openstack/nova-metadata-0"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.647735 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5357da13-b37d-4661-8e17-45ddaf365687-config-data\") pod \"nova-metadata-0\" (UID: \"5357da13-b37d-4661-8e17-45ddaf365687\") " pod="openstack/nova-metadata-0"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.658854 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn2s5\" (UniqueName: \"kubernetes.io/projected/5357da13-b37d-4661-8e17-45ddaf365687-kube-api-access-gn2s5\") pod \"nova-metadata-0\" (UID: \"5357da13-b37d-4661-8e17-45ddaf365687\") " pod="openstack/nova-metadata-0"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.685627 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.942751 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="27dea053-ce0e-4727-a310-785ccfde4424" containerName="nova-scheduler-scheduler" containerID="cri-o://c42a05ca46ec599c0c8976cee755ac8751f863590afac854c5be59addf6ee659" gracePeriod=30
Nov 25 09:21:07 crc kubenswrapper[4565]: I1125 09:21:07.964131 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 25 09:21:08 crc kubenswrapper[4565]: I1125 09:21:08.956640 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5357da13-b37d-4661-8e17-45ddaf365687","Type":"ContainerStarted","Data":"e11dc25364c6a8f168fa41a6bd09ceb0034be345452328d179073dd20099ed97"}
Nov 25 09:21:08 crc kubenswrapper[4565]: I1125 09:21:08.957310 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5357da13-b37d-4661-8e17-45ddaf365687","Type":"ContainerStarted","Data":"a2b8225dd086b6c265877c517e76e573f7cc8334933bab963bcdfb3feada45f0"}
Nov 25 09:21:08 crc kubenswrapper[4565]: I1125 09:21:08.957336 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5357da13-b37d-4661-8e17-45ddaf365687","Type":"ContainerStarted","Data":"c491dbdaa9a02316faa7220ca4f090009851479f1722b0f62f607733ce162cb7"}
Nov 25 09:21:08 crc kubenswrapper[4565]: I1125 09:21:08.985857 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.985819771 podStartE2EDuration="1.985819771s" podCreationTimestamp="2025-11-25 09:21:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:21:08.975313447 +0000 UTC m=+1002.177808585" watchObservedRunningTime="2025-11-25 09:21:08.985819771 +0000 UTC m=+1002.188314900"
Nov 25 09:21:09 crc kubenswrapper[4565]: I1125 09:21:09.107516 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b71bc132-0cb8-4caf-994d-1f9723b50949" path="/var/lib/kubelet/pods/b71bc132-0cb8-4caf-994d-1f9723b50949/volumes"
Nov 25 09:21:09 crc kubenswrapper[4565]: I1125 09:21:09.265991 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Nov 25 09:21:10 crc kubenswrapper[4565]: E1125 09:21:10.232290 4565 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c42a05ca46ec599c0c8976cee755ac8751f863590afac854c5be59addf6ee659" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Nov 25 09:21:10 crc kubenswrapper[4565]: E1125 09:21:10.236474 4565 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c42a05ca46ec599c0c8976cee755ac8751f863590afac854c5be59addf6ee659" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Nov 25 09:21:10 crc kubenswrapper[4565]: I1125 09:21:10.239635 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 25 09:21:10 crc kubenswrapper[4565]: I1125 09:21:10.239868 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="0fecdc60-114f-4981-9386-9814aab46033" containerName="kube-state-metrics" containerID="cri-o://fbbeb8419880a1d965c8d8dcb6b2db9863b40fdc3f6fce3eaa7873977ab7bc37" gracePeriod=30
Nov 25
09:21:10 crc kubenswrapper[4565]: E1125 09:21:10.242896 4565 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c42a05ca46ec599c0c8976cee755ac8751f863590afac854c5be59addf6ee659" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 25 09:21:10 crc kubenswrapper[4565]: E1125 09:21:10.242968 4565 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="27dea053-ce0e-4727-a310-785ccfde4424" containerName="nova-scheduler-scheduler" Nov 25 09:21:10 crc kubenswrapper[4565]: I1125 09:21:10.409704 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="0fecdc60-114f-4981-9386-9814aab46033" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.104:8081/readyz\": dial tcp 10.217.0.104:8081: connect: connection refused" Nov 25 09:21:10 crc kubenswrapper[4565]: I1125 09:21:10.737292 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 25 09:21:10 crc kubenswrapper[4565]: I1125 09:21:10.918356 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2kcl\" (UniqueName: \"kubernetes.io/projected/0fecdc60-114f-4981-9386-9814aab46033-kube-api-access-n2kcl\") pod \"0fecdc60-114f-4981-9386-9814aab46033\" (UID: \"0fecdc60-114f-4981-9386-9814aab46033\") " Nov 25 09:21:10 crc kubenswrapper[4565]: I1125 09:21:10.927673 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fecdc60-114f-4981-9386-9814aab46033-kube-api-access-n2kcl" (OuterVolumeSpecName: "kube-api-access-n2kcl") pod "0fecdc60-114f-4981-9386-9814aab46033" (UID: "0fecdc60-114f-4981-9386-9814aab46033"). InnerVolumeSpecName "kube-api-access-n2kcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:21:10 crc kubenswrapper[4565]: I1125 09:21:10.977961 4565 generic.go:334] "Generic (PLEG): container finished" podID="0fecdc60-114f-4981-9386-9814aab46033" containerID="fbbeb8419880a1d965c8d8dcb6b2db9863b40fdc3f6fce3eaa7873977ab7bc37" exitCode=2 Nov 25 09:21:10 crc kubenswrapper[4565]: I1125 09:21:10.978049 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0fecdc60-114f-4981-9386-9814aab46033","Type":"ContainerDied","Data":"fbbeb8419880a1d965c8d8dcb6b2db9863b40fdc3f6fce3eaa7873977ab7bc37"} Nov 25 09:21:10 crc kubenswrapper[4565]: I1125 09:21:10.978089 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 25 09:21:10 crc kubenswrapper[4565]: I1125 09:21:10.978122 4565 scope.go:117] "RemoveContainer" containerID="fbbeb8419880a1d965c8d8dcb6b2db9863b40fdc3f6fce3eaa7873977ab7bc37" Nov 25 09:21:10 crc kubenswrapper[4565]: I1125 09:21:10.978102 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0fecdc60-114f-4981-9386-9814aab46033","Type":"ContainerDied","Data":"283b70f639a67c8298e3cc68d96f59b461e30f842b9f65b52ae89c0002644e24"} Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.010921 4565 scope.go:117] "RemoveContainer" containerID="fbbeb8419880a1d965c8d8dcb6b2db9863b40fdc3f6fce3eaa7873977ab7bc37" Nov 25 09:21:11 crc kubenswrapper[4565]: E1125 09:21:11.011607 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbbeb8419880a1d965c8d8dcb6b2db9863b40fdc3f6fce3eaa7873977ab7bc37\": container with ID starting with fbbeb8419880a1d965c8d8dcb6b2db9863b40fdc3f6fce3eaa7873977ab7bc37 not found: ID does not exist" containerID="fbbeb8419880a1d965c8d8dcb6b2db9863b40fdc3f6fce3eaa7873977ab7bc37" Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.011664 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbbeb8419880a1d965c8d8dcb6b2db9863b40fdc3f6fce3eaa7873977ab7bc37"} err="failed to get container status \"fbbeb8419880a1d965c8d8dcb6b2db9863b40fdc3f6fce3eaa7873977ab7bc37\": rpc error: code = NotFound desc = could not find container \"fbbeb8419880a1d965c8d8dcb6b2db9863b40fdc3f6fce3eaa7873977ab7bc37\": container with ID starting with fbbeb8419880a1d965c8d8dcb6b2db9863b40fdc3f6fce3eaa7873977ab7bc37 not found: ID does not exist" Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.016903 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 
09:21:11.022879 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2kcl\" (UniqueName: \"kubernetes.io/projected/0fecdc60-114f-4981-9386-9814aab46033-kube-api-access-n2kcl\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.024434 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.045197 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 09:21:11 crc kubenswrapper[4565]: E1125 09:21:11.045728 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fecdc60-114f-4981-9386-9814aab46033" containerName="kube-state-metrics" Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.045753 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fecdc60-114f-4981-9386-9814aab46033" containerName="kube-state-metrics" Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.046088 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fecdc60-114f-4981-9386-9814aab46033" containerName="kube-state-metrics" Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.046862 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.049615 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.058425 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.089540 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.108572 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fecdc60-114f-4981-9386-9814aab46033" path="/var/lib/kubelet/pods/0fecdc60-114f-4981-9386-9814aab46033/volumes" Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.228103 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5275621d-5c51-4586-85f2-e0e24cb32266-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5275621d-5c51-4586-85f2-e0e24cb32266\") " pod="openstack/kube-state-metrics-0" Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.228166 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdd6p\" (UniqueName: \"kubernetes.io/projected/5275621d-5c51-4586-85f2-e0e24cb32266-kube-api-access-tdd6p\") pod \"kube-state-metrics-0\" (UID: \"5275621d-5c51-4586-85f2-e0e24cb32266\") " pod="openstack/kube-state-metrics-0" Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.228544 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5275621d-5c51-4586-85f2-e0e24cb32266-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: 
\"5275621d-5c51-4586-85f2-e0e24cb32266\") " pod="openstack/kube-state-metrics-0" Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.228683 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5275621d-5c51-4586-85f2-e0e24cb32266-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5275621d-5c51-4586-85f2-e0e24cb32266\") " pod="openstack/kube-state-metrics-0" Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.331425 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5275621d-5c51-4586-85f2-e0e24cb32266-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5275621d-5c51-4586-85f2-e0e24cb32266\") " pod="openstack/kube-state-metrics-0" Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.331630 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5275621d-5c51-4586-85f2-e0e24cb32266-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5275621d-5c51-4586-85f2-e0e24cb32266\") " pod="openstack/kube-state-metrics-0" Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.331666 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdd6p\" (UniqueName: \"kubernetes.io/projected/5275621d-5c51-4586-85f2-e0e24cb32266-kube-api-access-tdd6p\") pod \"kube-state-metrics-0\" (UID: \"5275621d-5c51-4586-85f2-e0e24cb32266\") " pod="openstack/kube-state-metrics-0" Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.331779 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5275621d-5c51-4586-85f2-e0e24cb32266-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: 
\"5275621d-5c51-4586-85f2-e0e24cb32266\") " pod="openstack/kube-state-metrics-0" Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.337031 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5275621d-5c51-4586-85f2-e0e24cb32266-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5275621d-5c51-4586-85f2-e0e24cb32266\") " pod="openstack/kube-state-metrics-0" Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.340594 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5275621d-5c51-4586-85f2-e0e24cb32266-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5275621d-5c51-4586-85f2-e0e24cb32266\") " pod="openstack/kube-state-metrics-0" Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.347984 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5275621d-5c51-4586-85f2-e0e24cb32266-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5275621d-5c51-4586-85f2-e0e24cb32266\") " pod="openstack/kube-state-metrics-0" Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.357511 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdd6p\" (UniqueName: \"kubernetes.io/projected/5275621d-5c51-4586-85f2-e0e24cb32266-kube-api-access-tdd6p\") pod \"kube-state-metrics-0\" (UID: \"5275621d-5c51-4586-85f2-e0e24cb32266\") " pod="openstack/kube-state-metrics-0" Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.366866 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.508791 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.509433 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93717c97-0833-46cd-bb1b-062e65667195" containerName="ceilometer-central-agent" containerID="cri-o://6d82cf82b26a114380674af7822fb6d734b44eb4ec562cfc9cfbdca55dd3c85b" gracePeriod=30 Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.509761 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93717c97-0833-46cd-bb1b-062e65667195" containerName="sg-core" containerID="cri-o://12b3252c0abe9e2d166361fd651af8edc3678b9b09e2943260d4f756d6acf679" gracePeriod=30 Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.509765 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93717c97-0833-46cd-bb1b-062e65667195" containerName="proxy-httpd" containerID="cri-o://c3ef2f64fe30e87e03aa02c99a2f75a529e3e0176c45204d950ec0ae4f985121" gracePeriod=30 Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.509794 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93717c97-0833-46cd-bb1b-062e65667195" containerName="ceilometer-notification-agent" containerID="cri-o://8ef703925cb9fcec1e23636e3acab3202d372199a5892e393673dbb0d6e47e4f" gracePeriod=30 Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.824869 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.990381 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"5275621d-5c51-4586-85f2-e0e24cb32266","Type":"ContainerStarted","Data":"c539c03c20ad3ccab7247d6f452331421258ee98ff01f9a5a13af8e6484f74b0"} Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.993394 4565 generic.go:334] "Generic (PLEG): container finished" podID="27dea053-ce0e-4727-a310-785ccfde4424" containerID="c42a05ca46ec599c0c8976cee755ac8751f863590afac854c5be59addf6ee659" exitCode=0 Nov 25 09:21:11 crc kubenswrapper[4565]: I1125 09:21:11.993453 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"27dea053-ce0e-4727-a310-785ccfde4424","Type":"ContainerDied","Data":"c42a05ca46ec599c0c8976cee755ac8751f863590afac854c5be59addf6ee659"} Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.003357 4565 generic.go:334] "Generic (PLEG): container finished" podID="93717c97-0833-46cd-bb1b-062e65667195" containerID="c3ef2f64fe30e87e03aa02c99a2f75a529e3e0176c45204d950ec0ae4f985121" exitCode=0 Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.003404 4565 generic.go:334] "Generic (PLEG): container finished" podID="93717c97-0833-46cd-bb1b-062e65667195" containerID="12b3252c0abe9e2d166361fd651af8edc3678b9b09e2943260d4f756d6acf679" exitCode=2 Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.003414 4565 generic.go:334] "Generic (PLEG): container finished" podID="93717c97-0833-46cd-bb1b-062e65667195" containerID="6d82cf82b26a114380674af7822fb6d734b44eb4ec562cfc9cfbdca55dd3c85b" exitCode=0 Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.003477 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93717c97-0833-46cd-bb1b-062e65667195","Type":"ContainerDied","Data":"c3ef2f64fe30e87e03aa02c99a2f75a529e3e0176c45204d950ec0ae4f985121"} Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.003518 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"93717c97-0833-46cd-bb1b-062e65667195","Type":"ContainerDied","Data":"12b3252c0abe9e2d166361fd651af8edc3678b9b09e2943260d4f756d6acf679"} Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.003531 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93717c97-0833-46cd-bb1b-062e65667195","Type":"ContainerDied","Data":"6d82cf82b26a114380674af7822fb6d734b44eb4ec562cfc9cfbdca55dd3c85b"} Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.278991 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.365992 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv8ts\" (UniqueName: \"kubernetes.io/projected/27dea053-ce0e-4727-a310-785ccfde4424-kube-api-access-vv8ts\") pod \"27dea053-ce0e-4727-a310-785ccfde4424\" (UID: \"27dea053-ce0e-4727-a310-785ccfde4424\") " Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.366104 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27dea053-ce0e-4727-a310-785ccfde4424-combined-ca-bundle\") pod \"27dea053-ce0e-4727-a310-785ccfde4424\" (UID: \"27dea053-ce0e-4727-a310-785ccfde4424\") " Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.366278 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27dea053-ce0e-4727-a310-785ccfde4424-config-data\") pod \"27dea053-ce0e-4727-a310-785ccfde4424\" (UID: \"27dea053-ce0e-4727-a310-785ccfde4424\") " Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.372214 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27dea053-ce0e-4727-a310-785ccfde4424-kube-api-access-vv8ts" (OuterVolumeSpecName: "kube-api-access-vv8ts") pod 
"27dea053-ce0e-4727-a310-785ccfde4424" (UID: "27dea053-ce0e-4727-a310-785ccfde4424"). InnerVolumeSpecName "kube-api-access-vv8ts". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.396465 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27dea053-ce0e-4727-a310-785ccfde4424-config-data" (OuterVolumeSpecName: "config-data") pod "27dea053-ce0e-4727-a310-785ccfde4424" (UID: "27dea053-ce0e-4727-a310-785ccfde4424"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.396708 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27dea053-ce0e-4727-a310-785ccfde4424-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27dea053-ce0e-4727-a310-785ccfde4424" (UID: "27dea053-ce0e-4727-a310-785ccfde4424"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.474373 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv8ts\" (UniqueName: \"kubernetes.io/projected/27dea053-ce0e-4727-a310-785ccfde4424-kube-api-access-vv8ts\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.474435 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27dea053-ce0e-4727-a310-785ccfde4424-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.474451 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27dea053-ce0e-4727-a310-785ccfde4424-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.686474 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.686905 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.895154 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.987999 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/751b5095-237a-402c-b339-1bb87853d797-logs\") pod \"751b5095-237a-402c-b339-1bb87853d797\" (UID: \"751b5095-237a-402c-b339-1bb87853d797\") " Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.988145 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/751b5095-237a-402c-b339-1bb87853d797-combined-ca-bundle\") pod \"751b5095-237a-402c-b339-1bb87853d797\" (UID: \"751b5095-237a-402c-b339-1bb87853d797\") " Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.988253 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/751b5095-237a-402c-b339-1bb87853d797-config-data\") pod \"751b5095-237a-402c-b339-1bb87853d797\" (UID: \"751b5095-237a-402c-b339-1bb87853d797\") " Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.988326 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hhvq\" (UniqueName: \"kubernetes.io/projected/751b5095-237a-402c-b339-1bb87853d797-kube-api-access-6hhvq\") pod \"751b5095-237a-402c-b339-1bb87853d797\" (UID: \"751b5095-237a-402c-b339-1bb87853d797\") " Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.988887 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/751b5095-237a-402c-b339-1bb87853d797-logs" (OuterVolumeSpecName: "logs") pod "751b5095-237a-402c-b339-1bb87853d797" (UID: "751b5095-237a-402c-b339-1bb87853d797"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.989044 4565 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/751b5095-237a-402c-b339-1bb87853d797-logs\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:12 crc kubenswrapper[4565]: I1125 09:21:12.992459 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/751b5095-237a-402c-b339-1bb87853d797-kube-api-access-6hhvq" (OuterVolumeSpecName: "kube-api-access-6hhvq") pod "751b5095-237a-402c-b339-1bb87853d797" (UID: "751b5095-237a-402c-b339-1bb87853d797"). InnerVolumeSpecName "kube-api-access-6hhvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.013004 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/751b5095-237a-402c-b339-1bb87853d797-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "751b5095-237a-402c-b339-1bb87853d797" (UID: "751b5095-237a-402c-b339-1bb87853d797"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.017245 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/751b5095-237a-402c-b339-1bb87853d797-config-data" (OuterVolumeSpecName: "config-data") pod "751b5095-237a-402c-b339-1bb87853d797" (UID: "751b5095-237a-402c-b339-1bb87853d797"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.020015 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.021112 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"27dea053-ce0e-4727-a310-785ccfde4424","Type":"ContainerDied","Data":"9f3ef7cadee617692425dbe537942d8b6f4f7aa76d7e4ceac6a8ab900076a97f"} Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.021450 4565 scope.go:117] "RemoveContainer" containerID="c42a05ca46ec599c0c8976cee755ac8751f863590afac854c5be59addf6ee659" Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.032349 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5275621d-5c51-4586-85f2-e0e24cb32266","Type":"ContainerStarted","Data":"ce5253aa6f321a5ef63976ba91e917dee7527c7db0254dce53a3e831d86f0ba8"} Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.032429 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.041252 4565 generic.go:334] "Generic (PLEG): container finished" podID="751b5095-237a-402c-b339-1bb87853d797" containerID="8ddda8e1b1c1cae4b0f5c498c90c3ae945b2191184b06d2a842f0f97c22af5e4" exitCode=0 Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.041296 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"751b5095-237a-402c-b339-1bb87853d797","Type":"ContainerDied","Data":"8ddda8e1b1c1cae4b0f5c498c90c3ae945b2191184b06d2a842f0f97c22af5e4"} Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.041324 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.041339 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"751b5095-237a-402c-b339-1bb87853d797","Type":"ContainerDied","Data":"5fae5afc9642f1207ce5e9eec9bdbbd2e0c042962e949d8c262ef7171f72747a"} Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.049109 4565 scope.go:117] "RemoveContainer" containerID="8ddda8e1b1c1cae4b0f5c498c90c3ae945b2191184b06d2a842f0f97c22af5e4" Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.061003 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.747088356 podStartE2EDuration="2.060984858s" podCreationTimestamp="2025-11-25 09:21:11 +0000 UTC" firstStartedPulling="2025-11-25 09:21:11.831668198 +0000 UTC m=+1005.034163336" lastFinishedPulling="2025-11-25 09:21:12.1455647 +0000 UTC m=+1005.348059838" observedRunningTime="2025-11-25 09:21:13.054729306 +0000 UTC m=+1006.257224445" watchObservedRunningTime="2025-11-25 09:21:13.060984858 +0000 UTC m=+1006.263479996" Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.075490 4565 scope.go:117] "RemoveContainer" containerID="c128fc740dcb2d3cb3b38bed69e5f9914a30a200b00d2e9defb8b7896d27573e" Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.082086 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.090823 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.092247 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/751b5095-237a-402c-b339-1bb87853d797-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.092369 4565 reconciler_common.go:293] "Volume 
detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/751b5095-237a-402c-b339-1bb87853d797-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.092442 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hhvq\" (UniqueName: \"kubernetes.io/projected/751b5095-237a-402c-b339-1bb87853d797-kube-api-access-6hhvq\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.099072 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 09:21:13 crc kubenswrapper[4565]: E1125 09:21:13.099554 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="751b5095-237a-402c-b339-1bb87853d797" containerName="nova-api-log" Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.099576 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="751b5095-237a-402c-b339-1bb87853d797" containerName="nova-api-log" Nov 25 09:21:13 crc kubenswrapper[4565]: E1125 09:21:13.099604 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="751b5095-237a-402c-b339-1bb87853d797" containerName="nova-api-api" Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.099610 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="751b5095-237a-402c-b339-1bb87853d797" containerName="nova-api-api" Nov 25 09:21:13 crc kubenswrapper[4565]: E1125 09:21:13.099631 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27dea053-ce0e-4727-a310-785ccfde4424" containerName="nova-scheduler-scheduler" Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.099638 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="27dea053-ce0e-4727-a310-785ccfde4424" containerName="nova-scheduler-scheduler" Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.099861 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="751b5095-237a-402c-b339-1bb87853d797" containerName="nova-api-api" Nov 25 
09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.099886 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="751b5095-237a-402c-b339-1bb87853d797" containerName="nova-api-log"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.099895 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="27dea053-ce0e-4727-a310-785ccfde4424" containerName="nova-scheduler-scheduler"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.101149 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.102055 4565 scope.go:117] "RemoveContainer" containerID="8ddda8e1b1c1cae4b0f5c498c90c3ae945b2191184b06d2a842f0f97c22af5e4"
Nov 25 09:21:13 crc kubenswrapper[4565]: E1125 09:21:13.102454 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ddda8e1b1c1cae4b0f5c498c90c3ae945b2191184b06d2a842f0f97c22af5e4\": container with ID starting with 8ddda8e1b1c1cae4b0f5c498c90c3ae945b2191184b06d2a842f0f97c22af5e4 not found: ID does not exist" containerID="8ddda8e1b1c1cae4b0f5c498c90c3ae945b2191184b06d2a842f0f97c22af5e4"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.102489 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ddda8e1b1c1cae4b0f5c498c90c3ae945b2191184b06d2a842f0f97c22af5e4"} err="failed to get container status \"8ddda8e1b1c1cae4b0f5c498c90c3ae945b2191184b06d2a842f0f97c22af5e4\": rpc error: code = NotFound desc = could not find container \"8ddda8e1b1c1cae4b0f5c498c90c3ae945b2191184b06d2a842f0f97c22af5e4\": container with ID starting with 8ddda8e1b1c1cae4b0f5c498c90c3ae945b2191184b06d2a842f0f97c22af5e4 not found: ID does not exist"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.102515 4565 scope.go:117] "RemoveContainer" containerID="c128fc740dcb2d3cb3b38bed69e5f9914a30a200b00d2e9defb8b7896d27573e"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.105690 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Nov 25 09:21:13 crc kubenswrapper[4565]: E1125 09:21:13.106196 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c128fc740dcb2d3cb3b38bed69e5f9914a30a200b00d2e9defb8b7896d27573e\": container with ID starting with c128fc740dcb2d3cb3b38bed69e5f9914a30a200b00d2e9defb8b7896d27573e not found: ID does not exist" containerID="c128fc740dcb2d3cb3b38bed69e5f9914a30a200b00d2e9defb8b7896d27573e"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.106227 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c128fc740dcb2d3cb3b38bed69e5f9914a30a200b00d2e9defb8b7896d27573e"} err="failed to get container status \"c128fc740dcb2d3cb3b38bed69e5f9914a30a200b00d2e9defb8b7896d27573e\": rpc error: code = NotFound desc = could not find container \"c128fc740dcb2d3cb3b38bed69e5f9914a30a200b00d2e9defb8b7896d27573e\": container with ID starting with c128fc740dcb2d3cb3b38bed69e5f9914a30a200b00d2e9defb8b7896d27573e not found: ID does not exist"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.122419 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27dea053-ce0e-4727-a310-785ccfde4424" path="/var/lib/kubelet/pods/27dea053-ce0e-4727-a310-785ccfde4424/volumes"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.123181 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.123211 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.123227 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.157403 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.159013 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.170525 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.202376 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/111effe2-6ce0-48eb-a9bc-afbf3a9e41cd-config-data\") pod \"nova-scheduler-0\" (UID: \"111effe2-6ce0-48eb-a9bc-afbf3a9e41cd\") " pod="openstack/nova-scheduler-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.202479 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/988505de-ea56-408e-8a9a-0847baf786b9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"988505de-ea56-408e-8a9a-0847baf786b9\") " pod="openstack/nova-api-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.202513 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/111effe2-6ce0-48eb-a9bc-afbf3a9e41cd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"111effe2-6ce0-48eb-a9bc-afbf3a9e41cd\") " pod="openstack/nova-scheduler-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.202615 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5z2s\" (UniqueName: \"kubernetes.io/projected/111effe2-6ce0-48eb-a9bc-afbf3a9e41cd-kube-api-access-v5z2s\") pod \"nova-scheduler-0\" (UID: \"111effe2-6ce0-48eb-a9bc-afbf3a9e41cd\") " pod="openstack/nova-scheduler-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.202718 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76t7m\" (UniqueName: \"kubernetes.io/projected/988505de-ea56-408e-8a9a-0847baf786b9-kube-api-access-76t7m\") pod \"nova-api-0\" (UID: \"988505de-ea56-408e-8a9a-0847baf786b9\") " pod="openstack/nova-api-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.202756 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/988505de-ea56-408e-8a9a-0847baf786b9-config-data\") pod \"nova-api-0\" (UID: \"988505de-ea56-408e-8a9a-0847baf786b9\") " pod="openstack/nova-api-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.202780 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/988505de-ea56-408e-8a9a-0847baf786b9-logs\") pod \"nova-api-0\" (UID: \"988505de-ea56-408e-8a9a-0847baf786b9\") " pod="openstack/nova-api-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.244529 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.304322 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76t7m\" (UniqueName: \"kubernetes.io/projected/988505de-ea56-408e-8a9a-0847baf786b9-kube-api-access-76t7m\") pod \"nova-api-0\" (UID: \"988505de-ea56-408e-8a9a-0847baf786b9\") " pod="openstack/nova-api-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.304379 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/988505de-ea56-408e-8a9a-0847baf786b9-config-data\") pod \"nova-api-0\" (UID: \"988505de-ea56-408e-8a9a-0847baf786b9\") " pod="openstack/nova-api-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.304414 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/988505de-ea56-408e-8a9a-0847baf786b9-logs\") pod \"nova-api-0\" (UID: \"988505de-ea56-408e-8a9a-0847baf786b9\") " pod="openstack/nova-api-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.304492 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/111effe2-6ce0-48eb-a9bc-afbf3a9e41cd-config-data\") pod \"nova-scheduler-0\" (UID: \"111effe2-6ce0-48eb-a9bc-afbf3a9e41cd\") " pod="openstack/nova-scheduler-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.304554 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/988505de-ea56-408e-8a9a-0847baf786b9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"988505de-ea56-408e-8a9a-0847baf786b9\") " pod="openstack/nova-api-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.304581 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/111effe2-6ce0-48eb-a9bc-afbf3a9e41cd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"111effe2-6ce0-48eb-a9bc-afbf3a9e41cd\") " pod="openstack/nova-scheduler-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.304686 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5z2s\" (UniqueName: \"kubernetes.io/projected/111effe2-6ce0-48eb-a9bc-afbf3a9e41cd-kube-api-access-v5z2s\") pod \"nova-scheduler-0\" (UID: \"111effe2-6ce0-48eb-a9bc-afbf3a9e41cd\") " pod="openstack/nova-scheduler-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.305172 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/988505de-ea56-408e-8a9a-0847baf786b9-logs\") pod \"nova-api-0\" (UID: \"988505de-ea56-408e-8a9a-0847baf786b9\") " pod="openstack/nova-api-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.311421 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/111effe2-6ce0-48eb-a9bc-afbf3a9e41cd-config-data\") pod \"nova-scheduler-0\" (UID: \"111effe2-6ce0-48eb-a9bc-afbf3a9e41cd\") " pod="openstack/nova-scheduler-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.312068 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/988505de-ea56-408e-8a9a-0847baf786b9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"988505de-ea56-408e-8a9a-0847baf786b9\") " pod="openstack/nova-api-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.314553 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/988505de-ea56-408e-8a9a-0847baf786b9-config-data\") pod \"nova-api-0\" (UID: \"988505de-ea56-408e-8a9a-0847baf786b9\") " pod="openstack/nova-api-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.318493 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/111effe2-6ce0-48eb-a9bc-afbf3a9e41cd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"111effe2-6ce0-48eb-a9bc-afbf3a9e41cd\") " pod="openstack/nova-scheduler-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.324399 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5z2s\" (UniqueName: \"kubernetes.io/projected/111effe2-6ce0-48eb-a9bc-afbf3a9e41cd-kube-api-access-v5z2s\") pod \"nova-scheduler-0\" (UID: \"111effe2-6ce0-48eb-a9bc-afbf3a9e41cd\") " pod="openstack/nova-scheduler-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.333901 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76t7m\" (UniqueName: \"kubernetes.io/projected/988505de-ea56-408e-8a9a-0847baf786b9-kube-api-access-76t7m\") pod \"nova-api-0\" (UID: \"988505de-ea56-408e-8a9a-0847baf786b9\") " pod="openstack/nova-api-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.424191 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.481049 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 25 09:21:13 crc kubenswrapper[4565]: I1125 09:21:13.874835 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.006812 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 25 09:21:14 crc kubenswrapper[4565]: W1125 09:21:14.009847 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod988505de_ea56_408e_8a9a_0847baf786b9.slice/crio-49340d19de66ee677e945d68f4a2e7b596636e2a98044e4f072f1fb2d2291a64 WatchSource:0}: Error finding container 49340d19de66ee677e945d68f4a2e7b596636e2a98044e4f072f1fb2d2291a64: Status 404 returned error can't find the container with id 49340d19de66ee677e945d68f4a2e7b596636e2a98044e4f072f1fb2d2291a64
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.059356 4565 generic.go:334] "Generic (PLEG): container finished" podID="93717c97-0833-46cd-bb1b-062e65667195" containerID="8ef703925cb9fcec1e23636e3acab3202d372199a5892e393673dbb0d6e47e4f" exitCode=0
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.059424 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93717c97-0833-46cd-bb1b-062e65667195","Type":"ContainerDied","Data":"8ef703925cb9fcec1e23636e3acab3202d372199a5892e393673dbb0d6e47e4f"}
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.061451 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"988505de-ea56-408e-8a9a-0847baf786b9","Type":"ContainerStarted","Data":"49340d19de66ee677e945d68f4a2e7b596636e2a98044e4f072f1fb2d2291a64"}
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.064431 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"111effe2-6ce0-48eb-a9bc-afbf3a9e41cd","Type":"ContainerStarted","Data":"d0980a2bbfef7f28c204ac1a31675d9aebdb1f5cd95791865713925b40c7bb05"}
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.189911 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.336057 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93717c97-0833-46cd-bb1b-062e65667195-combined-ca-bundle\") pod \"93717c97-0833-46cd-bb1b-062e65667195\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") "
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.336452 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93717c97-0833-46cd-bb1b-062e65667195-sg-core-conf-yaml\") pod \"93717c97-0833-46cd-bb1b-062e65667195\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") "
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.336537 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93717c97-0833-46cd-bb1b-062e65667195-config-data\") pod \"93717c97-0833-46cd-bb1b-062e65667195\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") "
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.337662 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93717c97-0833-46cd-bb1b-062e65667195-log-httpd\") pod \"93717c97-0833-46cd-bb1b-062e65667195\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") "
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.337710 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93717c97-0833-46cd-bb1b-062e65667195-run-httpd\") pod \"93717c97-0833-46cd-bb1b-062e65667195\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") "
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.337740 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djgn4\" (UniqueName: \"kubernetes.io/projected/93717c97-0833-46cd-bb1b-062e65667195-kube-api-access-djgn4\") pod \"93717c97-0833-46cd-bb1b-062e65667195\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") "
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.337842 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93717c97-0833-46cd-bb1b-062e65667195-scripts\") pod \"93717c97-0833-46cd-bb1b-062e65667195\" (UID: \"93717c97-0833-46cd-bb1b-062e65667195\") "
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.338504 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93717c97-0833-46cd-bb1b-062e65667195-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "93717c97-0833-46cd-bb1b-062e65667195" (UID: "93717c97-0833-46cd-bb1b-062e65667195"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.338805 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93717c97-0833-46cd-bb1b-062e65667195-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "93717c97-0833-46cd-bb1b-062e65667195" (UID: "93717c97-0833-46cd-bb1b-062e65667195"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.341018 4565 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93717c97-0833-46cd-bb1b-062e65667195-log-httpd\") on node \"crc\" DevicePath \"\""
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.341044 4565 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93717c97-0833-46cd-bb1b-062e65667195-run-httpd\") on node \"crc\" DevicePath \"\""
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.341501 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93717c97-0833-46cd-bb1b-062e65667195-scripts" (OuterVolumeSpecName: "scripts") pod "93717c97-0833-46cd-bb1b-062e65667195" (UID: "93717c97-0833-46cd-bb1b-062e65667195"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.343766 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93717c97-0833-46cd-bb1b-062e65667195-kube-api-access-djgn4" (OuterVolumeSpecName: "kube-api-access-djgn4") pod "93717c97-0833-46cd-bb1b-062e65667195" (UID: "93717c97-0833-46cd-bb1b-062e65667195"). InnerVolumeSpecName "kube-api-access-djgn4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.374139 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93717c97-0833-46cd-bb1b-062e65667195-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "93717c97-0833-46cd-bb1b-062e65667195" (UID: "93717c97-0833-46cd-bb1b-062e65667195"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.415302 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93717c97-0833-46cd-bb1b-062e65667195-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93717c97-0833-46cd-bb1b-062e65667195" (UID: "93717c97-0833-46cd-bb1b-062e65667195"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.437689 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93717c97-0833-46cd-bb1b-062e65667195-config-data" (OuterVolumeSpecName: "config-data") pod "93717c97-0833-46cd-bb1b-062e65667195" (UID: "93717c97-0833-46cd-bb1b-062e65667195"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.443497 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djgn4\" (UniqueName: \"kubernetes.io/projected/93717c97-0833-46cd-bb1b-062e65667195-kube-api-access-djgn4\") on node \"crc\" DevicePath \"\""
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.443605 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93717c97-0833-46cd-bb1b-062e65667195-scripts\") on node \"crc\" DevicePath \"\""
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.443686 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93717c97-0833-46cd-bb1b-062e65667195-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.443751 4565 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93717c97-0833-46cd-bb1b-062e65667195-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Nov 25 09:21:14 crc kubenswrapper[4565]: I1125 09:21:14.443807 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93717c97-0833-46cd-bb1b-062e65667195-config-data\") on node \"crc\" DevicePath \"\""
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.075663 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"111effe2-6ce0-48eb-a9bc-afbf3a9e41cd","Type":"ContainerStarted","Data":"3c6c044b308277119e3e6b9055b7e46a8db7655619bf54699061c1cc02e54646"}
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.079178 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93717c97-0833-46cd-bb1b-062e65667195","Type":"ContainerDied","Data":"5a354569d005a5f39c4fd2e984e148fbd4d6b800cdbd9ff090757686038a3582"}
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.079223 4565 scope.go:117] "RemoveContainer" containerID="c3ef2f64fe30e87e03aa02c99a2f75a529e3e0176c45204d950ec0ae4f985121"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.079368 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.087156 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"988505de-ea56-408e-8a9a-0847baf786b9","Type":"ContainerStarted","Data":"2bdff84d025348080081aa85586f9cc02f97e6430a31dbe712818b5f22dba58b"}
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.087209 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"988505de-ea56-408e-8a9a-0847baf786b9","Type":"ContainerStarted","Data":"fc8eb65ceeab5480b1ffdf67bf6827d12c4dc2b574a4eaf1ea870fb70cca141a"}
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.110139 4565 scope.go:117] "RemoveContainer" containerID="12b3252c0abe9e2d166361fd651af8edc3678b9b09e2943260d4f756d6acf679"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.113312 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.113291176 podStartE2EDuration="2.113291176s" podCreationTimestamp="2025-11-25 09:21:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:21:15.109519336 +0000 UTC m=+1008.312014474" watchObservedRunningTime="2025-11-25 09:21:15.113291176 +0000 UTC m=+1008.315786313"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.118561 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="751b5095-237a-402c-b339-1bb87853d797" path="/var/lib/kubelet/pods/751b5095-237a-402c-b339-1bb87853d797/volumes"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.129036 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.135878 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.150133 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Nov 25 09:21:15 crc kubenswrapper[4565]: E1125 09:21:15.150470 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93717c97-0833-46cd-bb1b-062e65667195" containerName="ceilometer-notification-agent"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.150489 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="93717c97-0833-46cd-bb1b-062e65667195" containerName="ceilometer-notification-agent"
Nov 25 09:21:15 crc kubenswrapper[4565]: E1125 09:21:15.150501 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93717c97-0833-46cd-bb1b-062e65667195" containerName="proxy-httpd"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.150508 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="93717c97-0833-46cd-bb1b-062e65667195" containerName="proxy-httpd"
Nov 25 09:21:15 crc kubenswrapper[4565]: E1125 09:21:15.150544 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93717c97-0833-46cd-bb1b-062e65667195" containerName="ceilometer-central-agent"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.150550 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="93717c97-0833-46cd-bb1b-062e65667195" containerName="ceilometer-central-agent"
Nov 25 09:21:15 crc kubenswrapper[4565]: E1125 09:21:15.150559 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93717c97-0833-46cd-bb1b-062e65667195" containerName="sg-core"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.150564 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="93717c97-0833-46cd-bb1b-062e65667195" containerName="sg-core"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.150725 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="93717c97-0833-46cd-bb1b-062e65667195" containerName="ceilometer-notification-agent"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.150745 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="93717c97-0833-46cd-bb1b-062e65667195" containerName="ceilometer-central-agent"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.150753 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="93717c97-0833-46cd-bb1b-062e65667195" containerName="sg-core"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.150760 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="93717c97-0833-46cd-bb1b-062e65667195" containerName="proxy-httpd"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.152210 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.152214 4565 scope.go:117] "RemoveContainer" containerID="8ef703925cb9fcec1e23636e3acab3202d372199a5892e393673dbb0d6e47e4f"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.161426 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.164903 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.164886713 podStartE2EDuration="2.164886713s" podCreationTimestamp="2025-11-25 09:21:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:21:15.152659545 +0000 UTC m=+1008.355154682" watchObservedRunningTime="2025-11-25 09:21:15.164886713 +0000 UTC m=+1008.367381840"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.165202 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.165466 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.165587 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.198493 4565 scope.go:117] "RemoveContainer" containerID="6d82cf82b26a114380674af7822fb6d734b44eb4ec562cfc9cfbdca55dd3c85b"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.257953 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-config-data\") pod \"ceilometer-0\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.258000 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6445290e-9596-4feb-aaff-2d6e68bf6d70-log-httpd\") pod \"ceilometer-0\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.258021 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6445290e-9596-4feb-aaff-2d6e68bf6d70-run-httpd\") pod \"ceilometer-0\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.258081 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j87vd\" (UniqueName: \"kubernetes.io/projected/6445290e-9596-4feb-aaff-2d6e68bf6d70-kube-api-access-j87vd\") pod \"ceilometer-0\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.258099 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.258312 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.258428 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.258586 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-scripts\") pod \"ceilometer-0\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.361087 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-config-data\") pod \"ceilometer-0\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.361147 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6445290e-9596-4feb-aaff-2d6e68bf6d70-log-httpd\") pod \"ceilometer-0\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.361177 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6445290e-9596-4feb-aaff-2d6e68bf6d70-run-httpd\") pod \"ceilometer-0\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.361214 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.361236 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j87vd\" (UniqueName: \"kubernetes.io/projected/6445290e-9596-4feb-aaff-2d6e68bf6d70-kube-api-access-j87vd\") pod \"ceilometer-0\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.361352 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.361427 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.361548 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-scripts\") pod \"ceilometer-0\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.361908 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6445290e-9596-4feb-aaff-2d6e68bf6d70-run-httpd\") pod \"ceilometer-0\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.362073 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6445290e-9596-4feb-aaff-2d6e68bf6d70-log-httpd\") pod \"ceilometer-0\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.366179 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.366360 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.367357 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-config-data\") pod \"ceilometer-0\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.374180 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.375382 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j87vd\" (UniqueName: \"kubernetes.io/projected/6445290e-9596-4feb-aaff-2d6e68bf6d70-kube-api-access-j87vd\") pod \"ceilometer-0\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.376685 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-scripts\") pod \"ceilometer-0\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " pod="openstack/ceilometer-0"
Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.469299 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:21:15 crc kubenswrapper[4565]: I1125 09:21:15.909368 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:21:16 crc kubenswrapper[4565]: I1125 09:21:16.101143 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6445290e-9596-4feb-aaff-2d6e68bf6d70","Type":"ContainerStarted","Data":"eb1c82ed4afc457cf9da2980a97e6d357a468be62b0b306cdddf06db8d88e37e"} Nov 25 09:21:17 crc kubenswrapper[4565]: I1125 09:21:17.135312 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93717c97-0833-46cd-bb1b-062e65667195" path="/var/lib/kubelet/pods/93717c97-0833-46cd-bb1b-062e65667195/volumes" Nov 25 09:21:17 crc kubenswrapper[4565]: I1125 09:21:17.162088 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6445290e-9596-4feb-aaff-2d6e68bf6d70","Type":"ContainerStarted","Data":"19f72a50b9b576f39907c876807fecba4751ac0644a007273d2e24e0e1c0f033"} Nov 25 09:21:17 crc kubenswrapper[4565]: I1125 09:21:17.686726 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 25 09:21:17 crc kubenswrapper[4565]: I1125 09:21:17.689457 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 25 09:21:18 crc kubenswrapper[4565]: I1125 09:21:18.168632 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6445290e-9596-4feb-aaff-2d6e68bf6d70","Type":"ContainerStarted","Data":"85a0827ef800cdc0fe991b4dfa8c1d5c42d4947a863adcc42407ca0278451548"} Nov 25 09:21:18 crc kubenswrapper[4565]: I1125 09:21:18.169791 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6445290e-9596-4feb-aaff-2d6e68bf6d70","Type":"ContainerStarted","Data":"af0f72bc15a2d6351977a9df9850513b7af39c7c38bcc885552231ad52a9b541"} Nov 25 
09:21:18 crc kubenswrapper[4565]: I1125 09:21:18.425603 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 25 09:21:18 crc kubenswrapper[4565]: I1125 09:21:18.701272 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5357da13-b37d-4661-8e17-45ddaf365687" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.177:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 25 09:21:18 crc kubenswrapper[4565]: I1125 09:21:18.701583 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5357da13-b37d-4661-8e17-45ddaf365687" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.177:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 09:21:20 crc kubenswrapper[4565]: I1125 09:21:20.183599 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6445290e-9596-4feb-aaff-2d6e68bf6d70","Type":"ContainerStarted","Data":"546923c5b6d79484e19f97091dd1034e786aa193cf0c282c33aa92da65d880cf"} Nov 25 09:21:20 crc kubenswrapper[4565]: I1125 09:21:20.184011 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 25 09:21:20 crc kubenswrapper[4565]: I1125 09:21:20.210801 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8610911319999999 podStartE2EDuration="5.210783569s" podCreationTimestamp="2025-11-25 09:21:15 +0000 UTC" firstStartedPulling="2025-11-25 09:21:15.915968494 +0000 UTC m=+1009.118463632" lastFinishedPulling="2025-11-25 09:21:19.265660931 +0000 UTC m=+1012.468156069" observedRunningTime="2025-11-25 09:21:20.209120163 +0000 UTC m=+1013.411615301" watchObservedRunningTime="2025-11-25 09:21:20.210783569 +0000 UTC m=+1013.413278707" Nov 25 
09:21:21 crc kubenswrapper[4565]: I1125 09:21:21.378650 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 25 09:21:23 crc kubenswrapper[4565]: I1125 09:21:23.425801 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 25 09:21:23 crc kubenswrapper[4565]: I1125 09:21:23.451012 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 25 09:21:23 crc kubenswrapper[4565]: I1125 09:21:23.481361 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 09:21:23 crc kubenswrapper[4565]: I1125 09:21:23.481422 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 09:21:24 crc kubenswrapper[4565]: I1125 09:21:24.255598 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 25 09:21:24 crc kubenswrapper[4565]: I1125 09:21:24.564092 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="988505de-ea56-408e-8a9a-0847baf786b9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.180:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 09:21:24 crc kubenswrapper[4565]: I1125 09:21:24.564106 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="988505de-ea56-408e-8a9a-0847baf786b9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.180:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 09:21:27 crc kubenswrapper[4565]: I1125 09:21:27.690094 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 25 09:21:27 crc kubenswrapper[4565]: I1125 09:21:27.692216 4565 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 25 09:21:27 crc kubenswrapper[4565]: I1125 09:21:27.695541 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 25 09:21:28 crc kubenswrapper[4565]: I1125 09:21:28.269162 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.126043 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.189142 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06be536c-6234-4539-ae36-7bd60a6d2097-config-data\") pod \"06be536c-6234-4539-ae36-7bd60a6d2097\" (UID: \"06be536c-6234-4539-ae36-7bd60a6d2097\") " Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.189382 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh468\" (UniqueName: \"kubernetes.io/projected/06be536c-6234-4539-ae36-7bd60a6d2097-kube-api-access-wh468\") pod \"06be536c-6234-4539-ae36-7bd60a6d2097\" (UID: \"06be536c-6234-4539-ae36-7bd60a6d2097\") " Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.196778 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06be536c-6234-4539-ae36-7bd60a6d2097-kube-api-access-wh468" (OuterVolumeSpecName: "kube-api-access-wh468") pod "06be536c-6234-4539-ae36-7bd60a6d2097" (UID: "06be536c-6234-4539-ae36-7bd60a6d2097"). InnerVolumeSpecName "kube-api-access-wh468". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.214439 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06be536c-6234-4539-ae36-7bd60a6d2097-config-data" (OuterVolumeSpecName: "config-data") pod "06be536c-6234-4539-ae36-7bd60a6d2097" (UID: "06be536c-6234-4539-ae36-7bd60a6d2097"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.291664 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06be536c-6234-4539-ae36-7bd60a6d2097-combined-ca-bundle\") pod \"06be536c-6234-4539-ae36-7bd60a6d2097\" (UID: \"06be536c-6234-4539-ae36-7bd60a6d2097\") " Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.293950 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06be536c-6234-4539-ae36-7bd60a6d2097-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.294001 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh468\" (UniqueName: \"kubernetes.io/projected/06be536c-6234-4539-ae36-7bd60a6d2097-kube-api-access-wh468\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.298699 4565 generic.go:334] "Generic (PLEG): container finished" podID="06be536c-6234-4539-ae36-7bd60a6d2097" containerID="97859a3cab1813b3135332d25057b42b373928deb71f6f581ae5caac1b4a4784" exitCode=137 Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.298785 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"06be536c-6234-4539-ae36-7bd60a6d2097","Type":"ContainerDied","Data":"97859a3cab1813b3135332d25057b42b373928deb71f6f581ae5caac1b4a4784"} Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 
09:21:31.298814 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.298861 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"06be536c-6234-4539-ae36-7bd60a6d2097","Type":"ContainerDied","Data":"9e9e1f89beb01fe35ccfdf55fec17cbe7861c026c860d3b7f5046a8f04d7a148"} Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.298888 4565 scope.go:117] "RemoveContainer" containerID="97859a3cab1813b3135332d25057b42b373928deb71f6f581ae5caac1b4a4784" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.320625 4565 scope.go:117] "RemoveContainer" containerID="97859a3cab1813b3135332d25057b42b373928deb71f6f581ae5caac1b4a4784" Nov 25 09:21:31 crc kubenswrapper[4565]: E1125 09:21:31.321149 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97859a3cab1813b3135332d25057b42b373928deb71f6f581ae5caac1b4a4784\": container with ID starting with 97859a3cab1813b3135332d25057b42b373928deb71f6f581ae5caac1b4a4784 not found: ID does not exist" containerID="97859a3cab1813b3135332d25057b42b373928deb71f6f581ae5caac1b4a4784" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.321186 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97859a3cab1813b3135332d25057b42b373928deb71f6f581ae5caac1b4a4784"} err="failed to get container status \"97859a3cab1813b3135332d25057b42b373928deb71f6f581ae5caac1b4a4784\": rpc error: code = NotFound desc = could not find container \"97859a3cab1813b3135332d25057b42b373928deb71f6f581ae5caac1b4a4784\": container with ID starting with 97859a3cab1813b3135332d25057b42b373928deb71f6f581ae5caac1b4a4784 not found: ID does not exist" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.322819 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/06be536c-6234-4539-ae36-7bd60a6d2097-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06be536c-6234-4539-ae36-7bd60a6d2097" (UID: "06be536c-6234-4539-ae36-7bd60a6d2097"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.395708 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06be536c-6234-4539-ae36-7bd60a6d2097-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.632361 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.647706 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.655593 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 09:21:31 crc kubenswrapper[4565]: E1125 09:21:31.656043 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06be536c-6234-4539-ae36-7bd60a6d2097" containerName="nova-cell1-novncproxy-novncproxy" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.656120 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="06be536c-6234-4539-ae36-7bd60a6d2097" containerName="nova-cell1-novncproxy-novncproxy" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.656406 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="06be536c-6234-4539-ae36-7bd60a6d2097" containerName="nova-cell1-novncproxy-novncproxy" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.657116 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.660811 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.661014 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.662544 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.667273 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.806221 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/42ef1e95-a44e-4dea-8127-228bd8065e0c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"42ef1e95-a44e-4dea-8127-228bd8065e0c\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.806295 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvzsd\" (UniqueName: \"kubernetes.io/projected/42ef1e95-a44e-4dea-8127-228bd8065e0c-kube-api-access-gvzsd\") pod \"nova-cell1-novncproxy-0\" (UID: \"42ef1e95-a44e-4dea-8127-228bd8065e0c\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.806437 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42ef1e95-a44e-4dea-8127-228bd8065e0c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"42ef1e95-a44e-4dea-8127-228bd8065e0c\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:21:31 
crc kubenswrapper[4565]: I1125 09:21:31.806487 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/42ef1e95-a44e-4dea-8127-228bd8065e0c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"42ef1e95-a44e-4dea-8127-228bd8065e0c\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.806609 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42ef1e95-a44e-4dea-8127-228bd8065e0c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"42ef1e95-a44e-4dea-8127-228bd8065e0c\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.909087 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42ef1e95-a44e-4dea-8127-228bd8065e0c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"42ef1e95-a44e-4dea-8127-228bd8065e0c\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.909171 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/42ef1e95-a44e-4dea-8127-228bd8065e0c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"42ef1e95-a44e-4dea-8127-228bd8065e0c\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.909229 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42ef1e95-a44e-4dea-8127-228bd8065e0c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"42ef1e95-a44e-4dea-8127-228bd8065e0c\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.909287 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/42ef1e95-a44e-4dea-8127-228bd8065e0c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"42ef1e95-a44e-4dea-8127-228bd8065e0c\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.909333 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvzsd\" (UniqueName: \"kubernetes.io/projected/42ef1e95-a44e-4dea-8127-228bd8065e0c-kube-api-access-gvzsd\") pod \"nova-cell1-novncproxy-0\" (UID: \"42ef1e95-a44e-4dea-8127-228bd8065e0c\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.916210 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42ef1e95-a44e-4dea-8127-228bd8065e0c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"42ef1e95-a44e-4dea-8127-228bd8065e0c\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.916458 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42ef1e95-a44e-4dea-8127-228bd8065e0c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"42ef1e95-a44e-4dea-8127-228bd8065e0c\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.917310 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/42ef1e95-a44e-4dea-8127-228bd8065e0c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"42ef1e95-a44e-4dea-8127-228bd8065e0c\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.919229 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/42ef1e95-a44e-4dea-8127-228bd8065e0c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"42ef1e95-a44e-4dea-8127-228bd8065e0c\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.924723 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvzsd\" (UniqueName: \"kubernetes.io/projected/42ef1e95-a44e-4dea-8127-228bd8065e0c-kube-api-access-gvzsd\") pod \"nova-cell1-novncproxy-0\" (UID: \"42ef1e95-a44e-4dea-8127-228bd8065e0c\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:21:31 crc kubenswrapper[4565]: I1125 09:21:31.982656 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:21:32 crc kubenswrapper[4565]: I1125 09:21:32.387007 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 09:21:32 crc kubenswrapper[4565]: W1125 09:21:32.389739 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42ef1e95_a44e_4dea_8127_228bd8065e0c.slice/crio-344995dd354dacb8dbeab1df37a9e3fc3f2d4857679a892b93cc3c1d19b4d451 WatchSource:0}: Error finding container 344995dd354dacb8dbeab1df37a9e3fc3f2d4857679a892b93cc3c1d19b4d451: Status 404 returned error can't find the container with id 344995dd354dacb8dbeab1df37a9e3fc3f2d4857679a892b93cc3c1d19b4d451 Nov 25 09:21:33 crc kubenswrapper[4565]: I1125 09:21:33.108514 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06be536c-6234-4539-ae36-7bd60a6d2097" path="/var/lib/kubelet/pods/06be536c-6234-4539-ae36-7bd60a6d2097/volumes" Nov 25 09:21:33 crc kubenswrapper[4565]: I1125 09:21:33.321469 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"42ef1e95-a44e-4dea-8127-228bd8065e0c","Type":"ContainerStarted","Data":"725f87e3e908714d6c1acc94a2daacb188de079a5e597c6065591d57207bd5ce"} Nov 25 09:21:33 crc kubenswrapper[4565]: I1125 09:21:33.321526 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"42ef1e95-a44e-4dea-8127-228bd8065e0c","Type":"ContainerStarted","Data":"344995dd354dacb8dbeab1df37a9e3fc3f2d4857679a892b93cc3c1d19b4d451"} Nov 25 09:21:33 crc kubenswrapper[4565]: I1125 09:21:33.369415 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.369392289 podStartE2EDuration="2.369392289s" podCreationTimestamp="2025-11-25 09:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:21:33.346597771 +0000 UTC m=+1026.549092909" watchObservedRunningTime="2025-11-25 09:21:33.369392289 +0000 UTC m=+1026.571887427" Nov 25 09:21:33 crc kubenswrapper[4565]: I1125 09:21:33.487619 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 25 09:21:33 crc kubenswrapper[4565]: I1125 09:21:33.488490 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 25 09:21:33 crc kubenswrapper[4565]: I1125 09:21:33.489952 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 25 09:21:33 crc kubenswrapper[4565]: I1125 09:21:33.490084 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 25 09:21:34 crc kubenswrapper[4565]: I1125 09:21:34.330436 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 25 09:21:34 crc kubenswrapper[4565]: I1125 09:21:34.334562 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-api-0" Nov 25 09:21:34 crc kubenswrapper[4565]: I1125 09:21:34.508150 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c9b558957-9nj2j"] Nov 25 09:21:34 crc kubenswrapper[4565]: I1125 09:21:34.509526 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c9b558957-9nj2j" Nov 25 09:21:34 crc kubenswrapper[4565]: I1125 09:21:34.520165 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c9b558957-9nj2j"] Nov 25 09:21:34 crc kubenswrapper[4565]: I1125 09:21:34.575402 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx98n\" (UniqueName: \"kubernetes.io/projected/6b02475b-df47-4dcd-b04b-4f3294a87d56-kube-api-access-fx98n\") pod \"dnsmasq-dns-c9b558957-9nj2j\" (UID: \"6b02475b-df47-4dcd-b04b-4f3294a87d56\") " pod="openstack/dnsmasq-dns-c9b558957-9nj2j" Nov 25 09:21:34 crc kubenswrapper[4565]: I1125 09:21:34.575497 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b02475b-df47-4dcd-b04b-4f3294a87d56-ovsdbserver-sb\") pod \"dnsmasq-dns-c9b558957-9nj2j\" (UID: \"6b02475b-df47-4dcd-b04b-4f3294a87d56\") " pod="openstack/dnsmasq-dns-c9b558957-9nj2j" Nov 25 09:21:34 crc kubenswrapper[4565]: I1125 09:21:34.575537 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b02475b-df47-4dcd-b04b-4f3294a87d56-ovsdbserver-nb\") pod \"dnsmasq-dns-c9b558957-9nj2j\" (UID: \"6b02475b-df47-4dcd-b04b-4f3294a87d56\") " pod="openstack/dnsmasq-dns-c9b558957-9nj2j" Nov 25 09:21:34 crc kubenswrapper[4565]: I1125 09:21:34.575571 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6b02475b-df47-4dcd-b04b-4f3294a87d56-config\") pod \"dnsmasq-dns-c9b558957-9nj2j\" (UID: \"6b02475b-df47-4dcd-b04b-4f3294a87d56\") " pod="openstack/dnsmasq-dns-c9b558957-9nj2j" Nov 25 09:21:34 crc kubenswrapper[4565]: I1125 09:21:34.575599 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b02475b-df47-4dcd-b04b-4f3294a87d56-dns-svc\") pod \"dnsmasq-dns-c9b558957-9nj2j\" (UID: \"6b02475b-df47-4dcd-b04b-4f3294a87d56\") " pod="openstack/dnsmasq-dns-c9b558957-9nj2j" Nov 25 09:21:34 crc kubenswrapper[4565]: I1125 09:21:34.678298 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx98n\" (UniqueName: \"kubernetes.io/projected/6b02475b-df47-4dcd-b04b-4f3294a87d56-kube-api-access-fx98n\") pod \"dnsmasq-dns-c9b558957-9nj2j\" (UID: \"6b02475b-df47-4dcd-b04b-4f3294a87d56\") " pod="openstack/dnsmasq-dns-c9b558957-9nj2j" Nov 25 09:21:34 crc kubenswrapper[4565]: I1125 09:21:34.678447 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b02475b-df47-4dcd-b04b-4f3294a87d56-ovsdbserver-sb\") pod \"dnsmasq-dns-c9b558957-9nj2j\" (UID: \"6b02475b-df47-4dcd-b04b-4f3294a87d56\") " pod="openstack/dnsmasq-dns-c9b558957-9nj2j" Nov 25 09:21:34 crc kubenswrapper[4565]: I1125 09:21:34.678504 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b02475b-df47-4dcd-b04b-4f3294a87d56-ovsdbserver-nb\") pod \"dnsmasq-dns-c9b558957-9nj2j\" (UID: \"6b02475b-df47-4dcd-b04b-4f3294a87d56\") " pod="openstack/dnsmasq-dns-c9b558957-9nj2j" Nov 25 09:21:34 crc kubenswrapper[4565]: I1125 09:21:34.678558 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6b02475b-df47-4dcd-b04b-4f3294a87d56-config\") pod \"dnsmasq-dns-c9b558957-9nj2j\" (UID: \"6b02475b-df47-4dcd-b04b-4f3294a87d56\") " pod="openstack/dnsmasq-dns-c9b558957-9nj2j" Nov 25 09:21:34 crc kubenswrapper[4565]: I1125 09:21:34.678595 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b02475b-df47-4dcd-b04b-4f3294a87d56-dns-svc\") pod \"dnsmasq-dns-c9b558957-9nj2j\" (UID: \"6b02475b-df47-4dcd-b04b-4f3294a87d56\") " pod="openstack/dnsmasq-dns-c9b558957-9nj2j" Nov 25 09:21:34 crc kubenswrapper[4565]: I1125 09:21:34.679779 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b02475b-df47-4dcd-b04b-4f3294a87d56-ovsdbserver-sb\") pod \"dnsmasq-dns-c9b558957-9nj2j\" (UID: \"6b02475b-df47-4dcd-b04b-4f3294a87d56\") " pod="openstack/dnsmasq-dns-c9b558957-9nj2j" Nov 25 09:21:34 crc kubenswrapper[4565]: I1125 09:21:34.679778 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b02475b-df47-4dcd-b04b-4f3294a87d56-ovsdbserver-nb\") pod \"dnsmasq-dns-c9b558957-9nj2j\" (UID: \"6b02475b-df47-4dcd-b04b-4f3294a87d56\") " pod="openstack/dnsmasq-dns-c9b558957-9nj2j" Nov 25 09:21:34 crc kubenswrapper[4565]: I1125 09:21:34.679900 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b02475b-df47-4dcd-b04b-4f3294a87d56-dns-svc\") pod \"dnsmasq-dns-c9b558957-9nj2j\" (UID: \"6b02475b-df47-4dcd-b04b-4f3294a87d56\") " pod="openstack/dnsmasq-dns-c9b558957-9nj2j" Nov 25 09:21:34 crc kubenswrapper[4565]: I1125 09:21:34.680071 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b02475b-df47-4dcd-b04b-4f3294a87d56-config\") pod \"dnsmasq-dns-c9b558957-9nj2j\" (UID: 
\"6b02475b-df47-4dcd-b04b-4f3294a87d56\") " pod="openstack/dnsmasq-dns-c9b558957-9nj2j" Nov 25 09:21:34 crc kubenswrapper[4565]: I1125 09:21:34.705105 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx98n\" (UniqueName: \"kubernetes.io/projected/6b02475b-df47-4dcd-b04b-4f3294a87d56-kube-api-access-fx98n\") pod \"dnsmasq-dns-c9b558957-9nj2j\" (UID: \"6b02475b-df47-4dcd-b04b-4f3294a87d56\") " pod="openstack/dnsmasq-dns-c9b558957-9nj2j" Nov 25 09:21:34 crc kubenswrapper[4565]: I1125 09:21:34.826454 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c9b558957-9nj2j" Nov 25 09:21:35 crc kubenswrapper[4565]: W1125 09:21:35.296558 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b02475b_df47_4dcd_b04b_4f3294a87d56.slice/crio-d825c291f039b63c053fcbc5485b06c4c5b95d2c34b9ffe9ffe6f74792147496 WatchSource:0}: Error finding container d825c291f039b63c053fcbc5485b06c4c5b95d2c34b9ffe9ffe6f74792147496: Status 404 returned error can't find the container with id d825c291f039b63c053fcbc5485b06c4c5b95d2c34b9ffe9ffe6f74792147496 Nov 25 09:21:35 crc kubenswrapper[4565]: I1125 09:21:35.298632 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c9b558957-9nj2j"] Nov 25 09:21:35 crc kubenswrapper[4565]: I1125 09:21:35.347652 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c9b558957-9nj2j" event={"ID":"6b02475b-df47-4dcd-b04b-4f3294a87d56","Type":"ContainerStarted","Data":"d825c291f039b63c053fcbc5485b06c4c5b95d2c34b9ffe9ffe6f74792147496"} Nov 25 09:21:36 crc kubenswrapper[4565]: I1125 09:21:36.362655 4565 generic.go:334] "Generic (PLEG): container finished" podID="6b02475b-df47-4dcd-b04b-4f3294a87d56" containerID="f01ba2eadf32c3355721856c5758dd10f41e42091fdd103d533a407a659f3a78" exitCode=0 Nov 25 09:21:36 crc kubenswrapper[4565]: I1125 
09:21:36.362711 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c9b558957-9nj2j" event={"ID":"6b02475b-df47-4dcd-b04b-4f3294a87d56","Type":"ContainerDied","Data":"f01ba2eadf32c3355721856c5758dd10f41e42091fdd103d533a407a659f3a78"} Nov 25 09:21:36 crc kubenswrapper[4565]: I1125 09:21:36.673133 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:21:36 crc kubenswrapper[4565]: I1125 09:21:36.673646 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6445290e-9596-4feb-aaff-2d6e68bf6d70" containerName="ceilometer-central-agent" containerID="cri-o://19f72a50b9b576f39907c876807fecba4751ac0644a007273d2e24e0e1c0f033" gracePeriod=30 Nov 25 09:21:36 crc kubenswrapper[4565]: I1125 09:21:36.673717 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6445290e-9596-4feb-aaff-2d6e68bf6d70" containerName="sg-core" containerID="cri-o://85a0827ef800cdc0fe991b4dfa8c1d5c42d4947a863adcc42407ca0278451548" gracePeriod=30 Nov 25 09:21:36 crc kubenswrapper[4565]: I1125 09:21:36.673763 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6445290e-9596-4feb-aaff-2d6e68bf6d70" containerName="ceilometer-notification-agent" containerID="cri-o://af0f72bc15a2d6351977a9df9850513b7af39c7c38bcc885552231ad52a9b541" gracePeriod=30 Nov 25 09:21:36 crc kubenswrapper[4565]: I1125 09:21:36.673824 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6445290e-9596-4feb-aaff-2d6e68bf6d70" containerName="proxy-httpd" containerID="cri-o://546923c5b6d79484e19f97091dd1034e786aa193cf0c282c33aa92da65d880cf" gracePeriod=30 Nov 25 09:21:36 crc kubenswrapper[4565]: I1125 09:21:36.684867 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" 
podUID="6445290e-9596-4feb-aaff-2d6e68bf6d70" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.181:3000/\": EOF" Nov 25 09:21:36 crc kubenswrapper[4565]: I1125 09:21:36.735415 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 25 09:21:36 crc kubenswrapper[4565]: I1125 09:21:36.983658 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.375776 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c9b558957-9nj2j" event={"ID":"6b02475b-df47-4dcd-b04b-4f3294a87d56","Type":"ContainerStarted","Data":"9c6e0741d178b38e6e4916def35d1d7ba3fd60dc60fe1eead4701a4ff046d57c"} Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.375970 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c9b558957-9nj2j" Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.379071 4565 generic.go:334] "Generic (PLEG): container finished" podID="6445290e-9596-4feb-aaff-2d6e68bf6d70" containerID="546923c5b6d79484e19f97091dd1034e786aa193cf0c282c33aa92da65d880cf" exitCode=0 Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.379095 4565 generic.go:334] "Generic (PLEG): container finished" podID="6445290e-9596-4feb-aaff-2d6e68bf6d70" containerID="85a0827ef800cdc0fe991b4dfa8c1d5c42d4947a863adcc42407ca0278451548" exitCode=2 Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.379113 4565 generic.go:334] "Generic (PLEG): container finished" podID="6445290e-9596-4feb-aaff-2d6e68bf6d70" containerID="af0f72bc15a2d6351977a9df9850513b7af39c7c38bcc885552231ad52a9b541" exitCode=0 Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.379120 4565 generic.go:334] "Generic (PLEG): container finished" podID="6445290e-9596-4feb-aaff-2d6e68bf6d70" containerID="19f72a50b9b576f39907c876807fecba4751ac0644a007273d2e24e0e1c0f033" exitCode=0 Nov 25 09:21:37 crc 
kubenswrapper[4565]: I1125 09:21:37.379118 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6445290e-9596-4feb-aaff-2d6e68bf6d70","Type":"ContainerDied","Data":"546923c5b6d79484e19f97091dd1034e786aa193cf0c282c33aa92da65d880cf"} Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.379171 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6445290e-9596-4feb-aaff-2d6e68bf6d70","Type":"ContainerDied","Data":"85a0827ef800cdc0fe991b4dfa8c1d5c42d4947a863adcc42407ca0278451548"} Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.379183 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6445290e-9596-4feb-aaff-2d6e68bf6d70","Type":"ContainerDied","Data":"af0f72bc15a2d6351977a9df9850513b7af39c7c38bcc885552231ad52a9b541"} Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.379195 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6445290e-9596-4feb-aaff-2d6e68bf6d70","Type":"ContainerDied","Data":"19f72a50b9b576f39907c876807fecba4751ac0644a007273d2e24e0e1c0f033"} Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.379291 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="988505de-ea56-408e-8a9a-0847baf786b9" containerName="nova-api-log" containerID="cri-o://fc8eb65ceeab5480b1ffdf67bf6827d12c4dc2b574a4eaf1ea870fb70cca141a" gracePeriod=30 Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.379386 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="988505de-ea56-408e-8a9a-0847baf786b9" containerName="nova-api-api" containerID="cri-o://2bdff84d025348080081aa85586f9cc02f97e6430a31dbe712818b5f22dba58b" gracePeriod=30 Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.395589 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-c9b558957-9nj2j" podStartSLOduration=3.395569914 podStartE2EDuration="3.395569914s" podCreationTimestamp="2025-11-25 09:21:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:21:37.38924431 +0000 UTC m=+1030.591739448" watchObservedRunningTime="2025-11-25 09:21:37.395569914 +0000 UTC m=+1030.598065052" Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.569002 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.744812 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-combined-ca-bundle\") pod \"6445290e-9596-4feb-aaff-2d6e68bf6d70\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.744867 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6445290e-9596-4feb-aaff-2d6e68bf6d70-run-httpd\") pod \"6445290e-9596-4feb-aaff-2d6e68bf6d70\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.745114 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-sg-core-conf-yaml\") pod \"6445290e-9596-4feb-aaff-2d6e68bf6d70\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.745147 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6445290e-9596-4feb-aaff-2d6e68bf6d70-log-httpd\") pod \"6445290e-9596-4feb-aaff-2d6e68bf6d70\" (UID: 
\"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.745241 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6445290e-9596-4feb-aaff-2d6e68bf6d70-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6445290e-9596-4feb-aaff-2d6e68bf6d70" (UID: "6445290e-9596-4feb-aaff-2d6e68bf6d70"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.745416 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-config-data\") pod \"6445290e-9596-4feb-aaff-2d6e68bf6d70\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.745764 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6445290e-9596-4feb-aaff-2d6e68bf6d70-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6445290e-9596-4feb-aaff-2d6e68bf6d70" (UID: "6445290e-9596-4feb-aaff-2d6e68bf6d70"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.746036 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-ceilometer-tls-certs\") pod \"6445290e-9596-4feb-aaff-2d6e68bf6d70\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.746091 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-scripts\") pod \"6445290e-9596-4feb-aaff-2d6e68bf6d70\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.746173 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j87vd\" (UniqueName: \"kubernetes.io/projected/6445290e-9596-4feb-aaff-2d6e68bf6d70-kube-api-access-j87vd\") pod \"6445290e-9596-4feb-aaff-2d6e68bf6d70\" (UID: \"6445290e-9596-4feb-aaff-2d6e68bf6d70\") " Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.747496 4565 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6445290e-9596-4feb-aaff-2d6e68bf6d70-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.747520 4565 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6445290e-9596-4feb-aaff-2d6e68bf6d70-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.753456 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6445290e-9596-4feb-aaff-2d6e68bf6d70-kube-api-access-j87vd" (OuterVolumeSpecName: "kube-api-access-j87vd") pod "6445290e-9596-4feb-aaff-2d6e68bf6d70" (UID: 
"6445290e-9596-4feb-aaff-2d6e68bf6d70"). InnerVolumeSpecName "kube-api-access-j87vd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.763381 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-scripts" (OuterVolumeSpecName: "scripts") pod "6445290e-9596-4feb-aaff-2d6e68bf6d70" (UID: "6445290e-9596-4feb-aaff-2d6e68bf6d70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.784058 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6445290e-9596-4feb-aaff-2d6e68bf6d70" (UID: "6445290e-9596-4feb-aaff-2d6e68bf6d70"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.825991 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6445290e-9596-4feb-aaff-2d6e68bf6d70" (UID: "6445290e-9596-4feb-aaff-2d6e68bf6d70"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.849371 4565 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.849399 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.849412 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j87vd\" (UniqueName: \"kubernetes.io/projected/6445290e-9596-4feb-aaff-2d6e68bf6d70-kube-api-access-j87vd\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.849423 4565 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.881099 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6445290e-9596-4feb-aaff-2d6e68bf6d70" (UID: "6445290e-9596-4feb-aaff-2d6e68bf6d70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.881304 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-config-data" (OuterVolumeSpecName: "config-data") pod "6445290e-9596-4feb-aaff-2d6e68bf6d70" (UID: "6445290e-9596-4feb-aaff-2d6e68bf6d70"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.951560 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:37 crc kubenswrapper[4565]: I1125 09:21:37.951591 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6445290e-9596-4feb-aaff-2d6e68bf6d70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.394239 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6445290e-9596-4feb-aaff-2d6e68bf6d70","Type":"ContainerDied","Data":"eb1c82ed4afc457cf9da2980a97e6d357a468be62b0b306cdddf06db8d88e37e"} Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.394316 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.394340 4565 scope.go:117] "RemoveContainer" containerID="546923c5b6d79484e19f97091dd1034e786aa193cf0c282c33aa92da65d880cf" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.397779 4565 generic.go:334] "Generic (PLEG): container finished" podID="988505de-ea56-408e-8a9a-0847baf786b9" containerID="fc8eb65ceeab5480b1ffdf67bf6827d12c4dc2b574a4eaf1ea870fb70cca141a" exitCode=143 Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.397957 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"988505de-ea56-408e-8a9a-0847baf786b9","Type":"ContainerDied","Data":"fc8eb65ceeab5480b1ffdf67bf6827d12c4dc2b574a4eaf1ea870fb70cca141a"} Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.426908 4565 scope.go:117] "RemoveContainer" containerID="85a0827ef800cdc0fe991b4dfa8c1d5c42d4947a863adcc42407ca0278451548" Nov 25 09:21:38 crc 
kubenswrapper[4565]: I1125 09:21:38.432771 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.459601 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.464789 4565 scope.go:117] "RemoveContainer" containerID="af0f72bc15a2d6351977a9df9850513b7af39c7c38bcc885552231ad52a9b541" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.469370 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:21:38 crc kubenswrapper[4565]: E1125 09:21:38.469804 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6445290e-9596-4feb-aaff-2d6e68bf6d70" containerName="ceilometer-notification-agent" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.469826 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6445290e-9596-4feb-aaff-2d6e68bf6d70" containerName="ceilometer-notification-agent" Nov 25 09:21:38 crc kubenswrapper[4565]: E1125 09:21:38.469839 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6445290e-9596-4feb-aaff-2d6e68bf6d70" containerName="ceilometer-central-agent" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.469845 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6445290e-9596-4feb-aaff-2d6e68bf6d70" containerName="ceilometer-central-agent" Nov 25 09:21:38 crc kubenswrapper[4565]: E1125 09:21:38.469861 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6445290e-9596-4feb-aaff-2d6e68bf6d70" containerName="proxy-httpd" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.469868 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6445290e-9596-4feb-aaff-2d6e68bf6d70" containerName="proxy-httpd" Nov 25 09:21:38 crc kubenswrapper[4565]: E1125 09:21:38.469891 4565 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6445290e-9596-4feb-aaff-2d6e68bf6d70" containerName="sg-core" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.469897 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6445290e-9596-4feb-aaff-2d6e68bf6d70" containerName="sg-core" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.470186 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="6445290e-9596-4feb-aaff-2d6e68bf6d70" containerName="proxy-httpd" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.470205 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="6445290e-9596-4feb-aaff-2d6e68bf6d70" containerName="ceilometer-notification-agent" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.470226 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="6445290e-9596-4feb-aaff-2d6e68bf6d70" containerName="ceilometer-central-agent" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.470234 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="6445290e-9596-4feb-aaff-2d6e68bf6d70" containerName="sg-core" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.482958 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.488097 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.488216 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.488558 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.488907 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.490420 4565 scope.go:117] "RemoveContainer" containerID="19f72a50b9b576f39907c876807fecba4751ac0644a007273d2e24e0e1c0f033" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.569534 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c96668-a926-4c72-8d14-4a7fa95b89cd-run-httpd\") pod \"ceilometer-0\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.569868 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-766tm\" (UniqueName: \"kubernetes.io/projected/e8c96668-a926-4c72-8d14-4a7fa95b89cd-kube-api-access-766tm\") pod \"ceilometer-0\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.569973 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " 
pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.570242 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-scripts\") pod \"ceilometer-0\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.570350 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-config-data\") pod \"ceilometer-0\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.570446 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.570702 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.570796 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c96668-a926-4c72-8d14-4a7fa95b89cd-log-httpd\") pod \"ceilometer-0\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.672683 4565 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.672838 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-scripts\") pod \"ceilometer-0\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.672906 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-config-data\") pod \"ceilometer-0\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.672997 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.673049 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.673074 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c96668-a926-4c72-8d14-4a7fa95b89cd-log-httpd\") pod \"ceilometer-0\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " 
pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.673100 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c96668-a926-4c72-8d14-4a7fa95b89cd-run-httpd\") pod \"ceilometer-0\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.673208 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-766tm\" (UniqueName: \"kubernetes.io/projected/e8c96668-a926-4c72-8d14-4a7fa95b89cd-kube-api-access-766tm\") pod \"ceilometer-0\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.678535 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.683483 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c96668-a926-4c72-8d14-4a7fa95b89cd-log-httpd\") pod \"ceilometer-0\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.683748 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c96668-a926-4c72-8d14-4a7fa95b89cd-run-httpd\") pod \"ceilometer-0\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.684091 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-config-data\") pod \"ceilometer-0\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.685854 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.686478 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-scripts\") pod \"ceilometer-0\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.688979 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.692368 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-766tm\" (UniqueName: \"kubernetes.io/projected/e8c96668-a926-4c72-8d14-4a7fa95b89cd-kube-api-access-766tm\") pod \"ceilometer-0\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " pod="openstack/ceilometer-0" Nov 25 09:21:38 crc kubenswrapper[4565]: I1125 09:21:38.799028 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:21:39 crc kubenswrapper[4565]: I1125 09:21:39.110453 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6445290e-9596-4feb-aaff-2d6e68bf6d70" path="/var/lib/kubelet/pods/6445290e-9596-4feb-aaff-2d6e68bf6d70/volumes" Nov 25 09:21:39 crc kubenswrapper[4565]: I1125 09:21:39.223228 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:21:39 crc kubenswrapper[4565]: I1125 09:21:39.408442 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8c96668-a926-4c72-8d14-4a7fa95b89cd","Type":"ContainerStarted","Data":"4ca76214db06ab128ac41fb62f1c98736f2be39f46f9b15c2548e257f8317e8d"} Nov 25 09:21:40 crc kubenswrapper[4565]: I1125 09:21:40.420872 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8c96668-a926-4c72-8d14-4a7fa95b89cd","Type":"ContainerStarted","Data":"dca4219f4122ad019aeb2b0488da9b72fd339aae9dfa8951172c224aa8d24909"} Nov 25 09:21:40 crc kubenswrapper[4565]: I1125 09:21:40.876543 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.020009 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/988505de-ea56-408e-8a9a-0847baf786b9-logs\") pod \"988505de-ea56-408e-8a9a-0847baf786b9\" (UID: \"988505de-ea56-408e-8a9a-0847baf786b9\") " Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.020081 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/988505de-ea56-408e-8a9a-0847baf786b9-config-data\") pod \"988505de-ea56-408e-8a9a-0847baf786b9\" (UID: \"988505de-ea56-408e-8a9a-0847baf786b9\") " Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.020197 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/988505de-ea56-408e-8a9a-0847baf786b9-combined-ca-bundle\") pod \"988505de-ea56-408e-8a9a-0847baf786b9\" (UID: \"988505de-ea56-408e-8a9a-0847baf786b9\") " Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.020260 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76t7m\" (UniqueName: \"kubernetes.io/projected/988505de-ea56-408e-8a9a-0847baf786b9-kube-api-access-76t7m\") pod \"988505de-ea56-408e-8a9a-0847baf786b9\" (UID: \"988505de-ea56-408e-8a9a-0847baf786b9\") " Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.021055 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/988505de-ea56-408e-8a9a-0847baf786b9-logs" (OuterVolumeSpecName: "logs") pod "988505de-ea56-408e-8a9a-0847baf786b9" (UID: "988505de-ea56-408e-8a9a-0847baf786b9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.021652 4565 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/988505de-ea56-408e-8a9a-0847baf786b9-logs\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.026798 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/988505de-ea56-408e-8a9a-0847baf786b9-kube-api-access-76t7m" (OuterVolumeSpecName: "kube-api-access-76t7m") pod "988505de-ea56-408e-8a9a-0847baf786b9" (UID: "988505de-ea56-408e-8a9a-0847baf786b9"). InnerVolumeSpecName "kube-api-access-76t7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.054049 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/988505de-ea56-408e-8a9a-0847baf786b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "988505de-ea56-408e-8a9a-0847baf786b9" (UID: "988505de-ea56-408e-8a9a-0847baf786b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.099077 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/988505de-ea56-408e-8a9a-0847baf786b9-config-data" (OuterVolumeSpecName: "config-data") pod "988505de-ea56-408e-8a9a-0847baf786b9" (UID: "988505de-ea56-408e-8a9a-0847baf786b9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.125523 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/988505de-ea56-408e-8a9a-0847baf786b9-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.125564 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/988505de-ea56-408e-8a9a-0847baf786b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.125577 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76t7m\" (UniqueName: \"kubernetes.io/projected/988505de-ea56-408e-8a9a-0847baf786b9-kube-api-access-76t7m\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:41 crc kubenswrapper[4565]: E1125 09:21:41.249577 4565 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod988505de_ea56_408e_8a9a_0847baf786b9.slice\": RecentStats: unable to find data in memory cache]" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.431393 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8c96668-a926-4c72-8d14-4a7fa95b89cd","Type":"ContainerStarted","Data":"6d211cf1c58730f51191a09c61542a9d6686ddcb2b9b063bcf46b4e2fd511248"} Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.433661 4565 generic.go:334] "Generic (PLEG): container finished" podID="988505de-ea56-408e-8a9a-0847baf786b9" containerID="2bdff84d025348080081aa85586f9cc02f97e6430a31dbe712818b5f22dba58b" exitCode=0 Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.433714 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"988505de-ea56-408e-8a9a-0847baf786b9","Type":"ContainerDied","Data":"2bdff84d025348080081aa85586f9cc02f97e6430a31dbe712818b5f22dba58b"} Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.433749 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"988505de-ea56-408e-8a9a-0847baf786b9","Type":"ContainerDied","Data":"49340d19de66ee677e945d68f4a2e7b596636e2a98044e4f072f1fb2d2291a64"} Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.433767 4565 scope.go:117] "RemoveContainer" containerID="2bdff84d025348080081aa85586f9cc02f97e6430a31dbe712818b5f22dba58b" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.433900 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.456689 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.457440 4565 scope.go:117] "RemoveContainer" containerID="fc8eb65ceeab5480b1ffdf67bf6827d12c4dc2b574a4eaf1ea870fb70cca141a" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.475879 4565 scope.go:117] "RemoveContainer" containerID="2bdff84d025348080081aa85586f9cc02f97e6430a31dbe712818b5f22dba58b" Nov 25 09:21:41 crc kubenswrapper[4565]: E1125 09:21:41.476263 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bdff84d025348080081aa85586f9cc02f97e6430a31dbe712818b5f22dba58b\": container with ID starting with 2bdff84d025348080081aa85586f9cc02f97e6430a31dbe712818b5f22dba58b not found: ID does not exist" containerID="2bdff84d025348080081aa85586f9cc02f97e6430a31dbe712818b5f22dba58b" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.476306 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bdff84d025348080081aa85586f9cc02f97e6430a31dbe712818b5f22dba58b"} 
err="failed to get container status \"2bdff84d025348080081aa85586f9cc02f97e6430a31dbe712818b5f22dba58b\": rpc error: code = NotFound desc = could not find container \"2bdff84d025348080081aa85586f9cc02f97e6430a31dbe712818b5f22dba58b\": container with ID starting with 2bdff84d025348080081aa85586f9cc02f97e6430a31dbe712818b5f22dba58b not found: ID does not exist" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.476348 4565 scope.go:117] "RemoveContainer" containerID="fc8eb65ceeab5480b1ffdf67bf6827d12c4dc2b574a4eaf1ea870fb70cca141a" Nov 25 09:21:41 crc kubenswrapper[4565]: E1125 09:21:41.476629 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc8eb65ceeab5480b1ffdf67bf6827d12c4dc2b574a4eaf1ea870fb70cca141a\": container with ID starting with fc8eb65ceeab5480b1ffdf67bf6827d12c4dc2b574a4eaf1ea870fb70cca141a not found: ID does not exist" containerID="fc8eb65ceeab5480b1ffdf67bf6827d12c4dc2b574a4eaf1ea870fb70cca141a" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.476659 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc8eb65ceeab5480b1ffdf67bf6827d12c4dc2b574a4eaf1ea870fb70cca141a"} err="failed to get container status \"fc8eb65ceeab5480b1ffdf67bf6827d12c4dc2b574a4eaf1ea870fb70cca141a\": rpc error: code = NotFound desc = could not find container \"fc8eb65ceeab5480b1ffdf67bf6827d12c4dc2b574a4eaf1ea870fb70cca141a\": container with ID starting with fc8eb65ceeab5480b1ffdf67bf6827d12c4dc2b574a4eaf1ea870fb70cca141a not found: ID does not exist" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.478100 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.483958 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 25 09:21:41 crc kubenswrapper[4565]: E1125 09:21:41.484358 4565 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="988505de-ea56-408e-8a9a-0847baf786b9" containerName="nova-api-log" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.484377 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="988505de-ea56-408e-8a9a-0847baf786b9" containerName="nova-api-log" Nov 25 09:21:41 crc kubenswrapper[4565]: E1125 09:21:41.484386 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="988505de-ea56-408e-8a9a-0847baf786b9" containerName="nova-api-api" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.484391 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="988505de-ea56-408e-8a9a-0847baf786b9" containerName="nova-api-api" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.484532 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="988505de-ea56-408e-8a9a-0847baf786b9" containerName="nova-api-log" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.484557 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="988505de-ea56-408e-8a9a-0847baf786b9" containerName="nova-api-api" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.485444 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.491643 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.491818 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.492093 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.494056 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.636389 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\") " pod="openstack/nova-api-0" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.636459 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwcd6\" (UniqueName: \"kubernetes.io/projected/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-kube-api-access-jwcd6\") pod \"nova-api-0\" (UID: \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\") " pod="openstack/nova-api-0" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.636509 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-logs\") pod \"nova-api-0\" (UID: \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\") " pod="openstack/nova-api-0" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.636542 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-config-data\") pod \"nova-api-0\" (UID: \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\") " pod="openstack/nova-api-0" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.636573 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\") " pod="openstack/nova-api-0" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.636592 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-public-tls-certs\") pod \"nova-api-0\" (UID: \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\") " pod="openstack/nova-api-0" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.737989 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\") " pod="openstack/nova-api-0" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.738467 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwcd6\" (UniqueName: \"kubernetes.io/projected/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-kube-api-access-jwcd6\") pod \"nova-api-0\" (UID: \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\") " pod="openstack/nova-api-0" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.738609 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-logs\") pod \"nova-api-0\" (UID: \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\") " pod="openstack/nova-api-0" Nov 25 
09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.738701 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-config-data\") pod \"nova-api-0\" (UID: \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\") " pod="openstack/nova-api-0" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.738800 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\") " pod="openstack/nova-api-0" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.738889 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-public-tls-certs\") pod \"nova-api-0\" (UID: \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\") " pod="openstack/nova-api-0" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.739256 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-logs\") pod \"nova-api-0\" (UID: \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\") " pod="openstack/nova-api-0" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.743291 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\") " pod="openstack/nova-api-0" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.748004 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-config-data\") pod \"nova-api-0\" (UID: 
\"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\") " pod="openstack/nova-api-0" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.748430 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\") " pod="openstack/nova-api-0" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.748852 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-public-tls-certs\") pod \"nova-api-0\" (UID: \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\") " pod="openstack/nova-api-0" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.760078 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwcd6\" (UniqueName: \"kubernetes.io/projected/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-kube-api-access-jwcd6\") pod \"nova-api-0\" (UID: \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\") " pod="openstack/nova-api-0" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.802136 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 25 09:21:41 crc kubenswrapper[4565]: I1125 09:21:41.983760 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:21:42 crc kubenswrapper[4565]: I1125 09:21:42.014184 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:21:42 crc kubenswrapper[4565]: I1125 09:21:42.302566 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 09:21:42 crc kubenswrapper[4565]: I1125 09:21:42.447906 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc7d25a8-fd84-48aa-95d5-4299451f2fc7","Type":"ContainerStarted","Data":"71af7fb071e9fcaaee174b15a4c7ddae16a00caf0ee69dbaa4ce9b2b036c0bb5"} Nov 25 09:21:42 crc kubenswrapper[4565]: I1125 09:21:42.459410 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8c96668-a926-4c72-8d14-4a7fa95b89cd","Type":"ContainerStarted","Data":"b5b5ad7ce46bf4df020b0a3a8b9731b0be26aae3115d73b31cd301496b8bc65f"} Nov 25 09:21:42 crc kubenswrapper[4565]: I1125 09:21:42.484739 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 25 09:21:42 crc kubenswrapper[4565]: I1125 09:21:42.631020 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-khsf2"] Nov 25 09:21:42 crc kubenswrapper[4565]: I1125 09:21:42.632288 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-khsf2" Nov 25 09:21:42 crc kubenswrapper[4565]: I1125 09:21:42.634010 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Nov 25 09:21:42 crc kubenswrapper[4565]: I1125 09:21:42.634221 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Nov 25 09:21:42 crc kubenswrapper[4565]: I1125 09:21:42.636230 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-khsf2"] Nov 25 09:21:42 crc kubenswrapper[4565]: I1125 09:21:42.770869 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b-scripts\") pod \"nova-cell1-cell-mapping-khsf2\" (UID: \"c53b2bc5-cd7f-4a87-878c-ca9deec24f8b\") " pod="openstack/nova-cell1-cell-mapping-khsf2" Nov 25 09:21:42 crc kubenswrapper[4565]: I1125 09:21:42.771111 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s9h8\" (UniqueName: \"kubernetes.io/projected/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b-kube-api-access-7s9h8\") pod \"nova-cell1-cell-mapping-khsf2\" (UID: \"c53b2bc5-cd7f-4a87-878c-ca9deec24f8b\") " pod="openstack/nova-cell1-cell-mapping-khsf2" Nov 25 09:21:42 crc kubenswrapper[4565]: I1125 09:21:42.771207 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-khsf2\" (UID: \"c53b2bc5-cd7f-4a87-878c-ca9deec24f8b\") " pod="openstack/nova-cell1-cell-mapping-khsf2" Nov 25 09:21:42 crc kubenswrapper[4565]: I1125 09:21:42.771244 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b-config-data\") pod \"nova-cell1-cell-mapping-khsf2\" (UID: \"c53b2bc5-cd7f-4a87-878c-ca9deec24f8b\") " pod="openstack/nova-cell1-cell-mapping-khsf2" Nov 25 09:21:42 crc kubenswrapper[4565]: I1125 09:21:42.872648 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-khsf2\" (UID: \"c53b2bc5-cd7f-4a87-878c-ca9deec24f8b\") " pod="openstack/nova-cell1-cell-mapping-khsf2" Nov 25 09:21:42 crc kubenswrapper[4565]: I1125 09:21:42.872698 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b-config-data\") pod \"nova-cell1-cell-mapping-khsf2\" (UID: \"c53b2bc5-cd7f-4a87-878c-ca9deec24f8b\") " pod="openstack/nova-cell1-cell-mapping-khsf2" Nov 25 09:21:42 crc kubenswrapper[4565]: I1125 09:21:42.872818 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b-scripts\") pod \"nova-cell1-cell-mapping-khsf2\" (UID: \"c53b2bc5-cd7f-4a87-878c-ca9deec24f8b\") " pod="openstack/nova-cell1-cell-mapping-khsf2" Nov 25 09:21:42 crc kubenswrapper[4565]: I1125 09:21:42.872917 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s9h8\" (UniqueName: \"kubernetes.io/projected/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b-kube-api-access-7s9h8\") pod \"nova-cell1-cell-mapping-khsf2\" (UID: \"c53b2bc5-cd7f-4a87-878c-ca9deec24f8b\") " pod="openstack/nova-cell1-cell-mapping-khsf2" Nov 25 09:21:42 crc kubenswrapper[4565]: I1125 09:21:42.876648 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b-config-data\") pod \"nova-cell1-cell-mapping-khsf2\" (UID: \"c53b2bc5-cd7f-4a87-878c-ca9deec24f8b\") " pod="openstack/nova-cell1-cell-mapping-khsf2" Nov 25 09:21:42 crc kubenswrapper[4565]: I1125 09:21:42.877446 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-khsf2\" (UID: \"c53b2bc5-cd7f-4a87-878c-ca9deec24f8b\") " pod="openstack/nova-cell1-cell-mapping-khsf2" Nov 25 09:21:42 crc kubenswrapper[4565]: I1125 09:21:42.878652 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b-scripts\") pod \"nova-cell1-cell-mapping-khsf2\" (UID: \"c53b2bc5-cd7f-4a87-878c-ca9deec24f8b\") " pod="openstack/nova-cell1-cell-mapping-khsf2" Nov 25 09:21:42 crc kubenswrapper[4565]: I1125 09:21:42.887719 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s9h8\" (UniqueName: \"kubernetes.io/projected/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b-kube-api-access-7s9h8\") pod \"nova-cell1-cell-mapping-khsf2\" (UID: \"c53b2bc5-cd7f-4a87-878c-ca9deec24f8b\") " pod="openstack/nova-cell1-cell-mapping-khsf2" Nov 25 09:21:42 crc kubenswrapper[4565]: I1125 09:21:42.959520 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-khsf2" Nov 25 09:21:43 crc kubenswrapper[4565]: I1125 09:21:43.121647 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="988505de-ea56-408e-8a9a-0847baf786b9" path="/var/lib/kubelet/pods/988505de-ea56-408e-8a9a-0847baf786b9/volumes" Nov 25 09:21:43 crc kubenswrapper[4565]: I1125 09:21:43.392447 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-khsf2"] Nov 25 09:21:43 crc kubenswrapper[4565]: W1125 09:21:43.394667 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc53b2bc5_cd7f_4a87_878c_ca9deec24f8b.slice/crio-eab0acfa1acd58981a61b7db1d3c9dd78d75974f32f3258658c89314a2410d2a WatchSource:0}: Error finding container eab0acfa1acd58981a61b7db1d3c9dd78d75974f32f3258658c89314a2410d2a: Status 404 returned error can't find the container with id eab0acfa1acd58981a61b7db1d3c9dd78d75974f32f3258658c89314a2410d2a Nov 25 09:21:43 crc kubenswrapper[4565]: I1125 09:21:43.475103 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-khsf2" event={"ID":"c53b2bc5-cd7f-4a87-878c-ca9deec24f8b","Type":"ContainerStarted","Data":"eab0acfa1acd58981a61b7db1d3c9dd78d75974f32f3258658c89314a2410d2a"} Nov 25 09:21:43 crc kubenswrapper[4565]: I1125 09:21:43.482096 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc7d25a8-fd84-48aa-95d5-4299451f2fc7","Type":"ContainerStarted","Data":"40cd4de375d2c64b3c61f49308748c991d090f590b242577dc1eefb9ea41dbfe"} Nov 25 09:21:43 crc kubenswrapper[4565]: I1125 09:21:43.482135 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc7d25a8-fd84-48aa-95d5-4299451f2fc7","Type":"ContainerStarted","Data":"a129c18aec59ac257f15545c77b78edd8e3995c607c9d3a5f4a11fe641c814c5"} Nov 25 09:21:43 crc kubenswrapper[4565]: I1125 09:21:43.488822 
4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8c96668-a926-4c72-8d14-4a7fa95b89cd","Type":"ContainerStarted","Data":"ae2e8815c54b843e87677bd998363eefad45104c69084dfe3b9fce2408251cd4"} Nov 25 09:21:43 crc kubenswrapper[4565]: I1125 09:21:43.516692 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.516672032 podStartE2EDuration="2.516672032s" podCreationTimestamp="2025-11-25 09:21:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:21:43.513607816 +0000 UTC m=+1036.716102953" watchObservedRunningTime="2025-11-25 09:21:43.516672032 +0000 UTC m=+1036.719167170" Nov 25 09:21:43 crc kubenswrapper[4565]: I1125 09:21:43.551683 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.714650098 podStartE2EDuration="5.551672668s" podCreationTimestamp="2025-11-25 09:21:38 +0000 UTC" firstStartedPulling="2025-11-25 09:21:39.236156713 +0000 UTC m=+1032.438651840" lastFinishedPulling="2025-11-25 09:21:43.073179282 +0000 UTC m=+1036.275674410" observedRunningTime="2025-11-25 09:21:43.544285002 +0000 UTC m=+1036.746780139" watchObservedRunningTime="2025-11-25 09:21:43.551672668 +0000 UTC m=+1036.754167806" Nov 25 09:21:44 crc kubenswrapper[4565]: I1125 09:21:44.500767 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-khsf2" event={"ID":"c53b2bc5-cd7f-4a87-878c-ca9deec24f8b","Type":"ContainerStarted","Data":"b4cc5ccc2dcf5d502d78a9e42f5309ed5a97af0358538892d23e6540da4b7e1b"} Nov 25 09:21:44 crc kubenswrapper[4565]: I1125 09:21:44.501966 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 25 09:21:44 crc kubenswrapper[4565]: I1125 09:21:44.828098 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/dnsmasq-dns-c9b558957-9nj2j" Nov 25 09:21:44 crc kubenswrapper[4565]: I1125 09:21:44.850971 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-khsf2" podStartSLOduration=2.850949662 podStartE2EDuration="2.850949662s" podCreationTimestamp="2025-11-25 09:21:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:21:44.528529403 +0000 UTC m=+1037.731024541" watchObservedRunningTime="2025-11-25 09:21:44.850949662 +0000 UTC m=+1038.053444800" Nov 25 09:21:44 crc kubenswrapper[4565]: I1125 09:21:44.903551 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69494d9f89-7hsws"] Nov 25 09:21:44 crc kubenswrapper[4565]: I1125 09:21:44.911195 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69494d9f89-7hsws" podUID="46e9931e-fcd4-4c18-8c06-537d4c162c1f" containerName="dnsmasq-dns" containerID="cri-o://9d7c7339722350082152669e174f85cfce99c05930a968e99cf081c699ea685c" gracePeriod=10 Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.415032 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69494d9f89-7hsws" Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.451392 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twkgv\" (UniqueName: \"kubernetes.io/projected/46e9931e-fcd4-4c18-8c06-537d4c162c1f-kube-api-access-twkgv\") pod \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\" (UID: \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\") " Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.451445 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46e9931e-fcd4-4c18-8c06-537d4c162c1f-config\") pod \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\" (UID: \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\") " Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.451525 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46e9931e-fcd4-4c18-8c06-537d4c162c1f-ovsdbserver-nb\") pod \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\" (UID: \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\") " Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.451706 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46e9931e-fcd4-4c18-8c06-537d4c162c1f-ovsdbserver-sb\") pod \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\" (UID: \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\") " Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.451831 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46e9931e-fcd4-4c18-8c06-537d4c162c1f-dns-svc\") pod \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\" (UID: \"46e9931e-fcd4-4c18-8c06-537d4c162c1f\") " Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.483111 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/46e9931e-fcd4-4c18-8c06-537d4c162c1f-kube-api-access-twkgv" (OuterVolumeSpecName: "kube-api-access-twkgv") pod "46e9931e-fcd4-4c18-8c06-537d4c162c1f" (UID: "46e9931e-fcd4-4c18-8c06-537d4c162c1f"). InnerVolumeSpecName "kube-api-access-twkgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.513207 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46e9931e-fcd4-4c18-8c06-537d4c162c1f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "46e9931e-fcd4-4c18-8c06-537d4c162c1f" (UID: "46e9931e-fcd4-4c18-8c06-537d4c162c1f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.517540 4565 generic.go:334] "Generic (PLEG): container finished" podID="46e9931e-fcd4-4c18-8c06-537d4c162c1f" containerID="9d7c7339722350082152669e174f85cfce99c05930a968e99cf081c699ea685c" exitCode=0 Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.518021 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69494d9f89-7hsws" event={"ID":"46e9931e-fcd4-4c18-8c06-537d4c162c1f","Type":"ContainerDied","Data":"9d7c7339722350082152669e174f85cfce99c05930a968e99cf081c699ea685c"} Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.518076 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69494d9f89-7hsws" event={"ID":"46e9931e-fcd4-4c18-8c06-537d4c162c1f","Type":"ContainerDied","Data":"0baf60a0c34977bcd007dc5f22242f78959fdbe5bb5a275b90a2de09292c82ca"} Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.518097 4565 scope.go:117] "RemoveContainer" containerID="9d7c7339722350082152669e174f85cfce99c05930a968e99cf081c699ea685c" Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.518538 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69494d9f89-7hsws" Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.526916 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46e9931e-fcd4-4c18-8c06-537d4c162c1f-config" (OuterVolumeSpecName: "config") pod "46e9931e-fcd4-4c18-8c06-537d4c162c1f" (UID: "46e9931e-fcd4-4c18-8c06-537d4c162c1f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.546695 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46e9931e-fcd4-4c18-8c06-537d4c162c1f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "46e9931e-fcd4-4c18-8c06-537d4c162c1f" (UID: "46e9931e-fcd4-4c18-8c06-537d4c162c1f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.548852 4565 scope.go:117] "RemoveContainer" containerID="19bcefb80c4f2d1a52e3910027bb93ee5ac61fef22ba117ba5d92204e1aee971" Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.553426 4565 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46e9931e-fcd4-4c18-8c06-537d4c162c1f-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.553453 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twkgv\" (UniqueName: \"kubernetes.io/projected/46e9931e-fcd4-4c18-8c06-537d4c162c1f-kube-api-access-twkgv\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.553464 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46e9931e-fcd4-4c18-8c06-537d4c162c1f-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.553455 4565 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46e9931e-fcd4-4c18-8c06-537d4c162c1f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "46e9931e-fcd4-4c18-8c06-537d4c162c1f" (UID: "46e9931e-fcd4-4c18-8c06-537d4c162c1f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.553472 4565 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46e9931e-fcd4-4c18-8c06-537d4c162c1f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.569716 4565 scope.go:117] "RemoveContainer" containerID="9d7c7339722350082152669e174f85cfce99c05930a968e99cf081c699ea685c" Nov 25 09:21:45 crc kubenswrapper[4565]: E1125 09:21:45.570212 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d7c7339722350082152669e174f85cfce99c05930a968e99cf081c699ea685c\": container with ID starting with 9d7c7339722350082152669e174f85cfce99c05930a968e99cf081c699ea685c not found: ID does not exist" containerID="9d7c7339722350082152669e174f85cfce99c05930a968e99cf081c699ea685c" Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.570252 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d7c7339722350082152669e174f85cfce99c05930a968e99cf081c699ea685c"} err="failed to get container status \"9d7c7339722350082152669e174f85cfce99c05930a968e99cf081c699ea685c\": rpc error: code = NotFound desc = could not find container \"9d7c7339722350082152669e174f85cfce99c05930a968e99cf081c699ea685c\": container with ID starting with 9d7c7339722350082152669e174f85cfce99c05930a968e99cf081c699ea685c not found: ID does not exist" Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.570281 4565 scope.go:117] "RemoveContainer" 
containerID="19bcefb80c4f2d1a52e3910027bb93ee5ac61fef22ba117ba5d92204e1aee971" Nov 25 09:21:45 crc kubenswrapper[4565]: E1125 09:21:45.570631 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19bcefb80c4f2d1a52e3910027bb93ee5ac61fef22ba117ba5d92204e1aee971\": container with ID starting with 19bcefb80c4f2d1a52e3910027bb93ee5ac61fef22ba117ba5d92204e1aee971 not found: ID does not exist" containerID="19bcefb80c4f2d1a52e3910027bb93ee5ac61fef22ba117ba5d92204e1aee971" Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.570685 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19bcefb80c4f2d1a52e3910027bb93ee5ac61fef22ba117ba5d92204e1aee971"} err="failed to get container status \"19bcefb80c4f2d1a52e3910027bb93ee5ac61fef22ba117ba5d92204e1aee971\": rpc error: code = NotFound desc = could not find container \"19bcefb80c4f2d1a52e3910027bb93ee5ac61fef22ba117ba5d92204e1aee971\": container with ID starting with 19bcefb80c4f2d1a52e3910027bb93ee5ac61fef22ba117ba5d92204e1aee971 not found: ID does not exist" Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.655384 4565 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46e9931e-fcd4-4c18-8c06-537d4c162c1f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.863973 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69494d9f89-7hsws"] Nov 25 09:21:45 crc kubenswrapper[4565]: I1125 09:21:45.870003 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69494d9f89-7hsws"] Nov 25 09:21:47 crc kubenswrapper[4565]: I1125 09:21:47.117821 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46e9931e-fcd4-4c18-8c06-537d4c162c1f" path="/var/lib/kubelet/pods/46e9931e-fcd4-4c18-8c06-537d4c162c1f/volumes" Nov 25 09:21:48 
crc kubenswrapper[4565]: I1125 09:21:48.544709 4565 generic.go:334] "Generic (PLEG): container finished" podID="c53b2bc5-cd7f-4a87-878c-ca9deec24f8b" containerID="b4cc5ccc2dcf5d502d78a9e42f5309ed5a97af0358538892d23e6540da4b7e1b" exitCode=0 Nov 25 09:21:48 crc kubenswrapper[4565]: I1125 09:21:48.544757 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-khsf2" event={"ID":"c53b2bc5-cd7f-4a87-878c-ca9deec24f8b","Type":"ContainerDied","Data":"b4cc5ccc2dcf5d502d78a9e42f5309ed5a97af0358538892d23e6540da4b7e1b"} Nov 25 09:21:49 crc kubenswrapper[4565]: I1125 09:21:49.807759 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-khsf2" Nov 25 09:21:49 crc kubenswrapper[4565]: I1125 09:21:49.848613 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s9h8\" (UniqueName: \"kubernetes.io/projected/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b-kube-api-access-7s9h8\") pod \"c53b2bc5-cd7f-4a87-878c-ca9deec24f8b\" (UID: \"c53b2bc5-cd7f-4a87-878c-ca9deec24f8b\") " Nov 25 09:21:49 crc kubenswrapper[4565]: I1125 09:21:49.848860 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b-config-data\") pod \"c53b2bc5-cd7f-4a87-878c-ca9deec24f8b\" (UID: \"c53b2bc5-cd7f-4a87-878c-ca9deec24f8b\") " Nov 25 09:21:49 crc kubenswrapper[4565]: I1125 09:21:49.849008 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b-combined-ca-bundle\") pod \"c53b2bc5-cd7f-4a87-878c-ca9deec24f8b\" (UID: \"c53b2bc5-cd7f-4a87-878c-ca9deec24f8b\") " Nov 25 09:21:49 crc kubenswrapper[4565]: I1125 09:21:49.849068 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b-scripts\") pod \"c53b2bc5-cd7f-4a87-878c-ca9deec24f8b\" (UID: \"c53b2bc5-cd7f-4a87-878c-ca9deec24f8b\") " Nov 25 09:21:49 crc kubenswrapper[4565]: I1125 09:21:49.868905 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b-scripts" (OuterVolumeSpecName: "scripts") pod "c53b2bc5-cd7f-4a87-878c-ca9deec24f8b" (UID: "c53b2bc5-cd7f-4a87-878c-ca9deec24f8b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:49 crc kubenswrapper[4565]: I1125 09:21:49.872796 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c53b2bc5-cd7f-4a87-878c-ca9deec24f8b" (UID: "c53b2bc5-cd7f-4a87-878c-ca9deec24f8b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:49 crc kubenswrapper[4565]: I1125 09:21:49.875542 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b-kube-api-access-7s9h8" (OuterVolumeSpecName: "kube-api-access-7s9h8") pod "c53b2bc5-cd7f-4a87-878c-ca9deec24f8b" (UID: "c53b2bc5-cd7f-4a87-878c-ca9deec24f8b"). InnerVolumeSpecName "kube-api-access-7s9h8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:21:49 crc kubenswrapper[4565]: I1125 09:21:49.880166 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b-config-data" (OuterVolumeSpecName: "config-data") pod "c53b2bc5-cd7f-4a87-878c-ca9deec24f8b" (UID: "c53b2bc5-cd7f-4a87-878c-ca9deec24f8b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:49 crc kubenswrapper[4565]: I1125 09:21:49.952125 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:49 crc kubenswrapper[4565]: I1125 09:21:49.952184 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:49 crc kubenswrapper[4565]: I1125 09:21:49.952196 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s9h8\" (UniqueName: \"kubernetes.io/projected/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b-kube-api-access-7s9h8\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:49 crc kubenswrapper[4565]: I1125 09:21:49.952211 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:50 crc kubenswrapper[4565]: I1125 09:21:50.386643 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-69494d9f89-7hsws" podUID="46e9931e-fcd4-4c18-8c06-537d4c162c1f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.173:5353: i/o timeout" Nov 25 09:21:50 crc kubenswrapper[4565]: I1125 09:21:50.566210 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-khsf2" event={"ID":"c53b2bc5-cd7f-4a87-878c-ca9deec24f8b","Type":"ContainerDied","Data":"eab0acfa1acd58981a61b7db1d3c9dd78d75974f32f3258658c89314a2410d2a"} Nov 25 09:21:50 crc kubenswrapper[4565]: I1125 09:21:50.566271 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eab0acfa1acd58981a61b7db1d3c9dd78d75974f32f3258658c89314a2410d2a" Nov 25 09:21:50 crc kubenswrapper[4565]: 
I1125 09:21:50.566272 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-khsf2" Nov 25 09:21:50 crc kubenswrapper[4565]: I1125 09:21:50.754919 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 25 09:21:50 crc kubenswrapper[4565]: I1125 09:21:50.755239 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fc7d25a8-fd84-48aa-95d5-4299451f2fc7" containerName="nova-api-log" containerID="cri-o://a129c18aec59ac257f15545c77b78edd8e3995c607c9d3a5f4a11fe641c814c5" gracePeriod=30 Nov 25 09:21:50 crc kubenswrapper[4565]: I1125 09:21:50.755294 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fc7d25a8-fd84-48aa-95d5-4299451f2fc7" containerName="nova-api-api" containerID="cri-o://40cd4de375d2c64b3c61f49308748c991d090f590b242577dc1eefb9ea41dbfe" gracePeriod=30 Nov 25 09:21:50 crc kubenswrapper[4565]: I1125 09:21:50.769497 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 09:21:50 crc kubenswrapper[4565]: I1125 09:21:50.769664 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="111effe2-6ce0-48eb-a9bc-afbf3a9e41cd" containerName="nova-scheduler-scheduler" containerID="cri-o://3c6c044b308277119e3e6b9055b7e46a8db7655619bf54699061c1cc02e54646" gracePeriod=30 Nov 25 09:21:50 crc kubenswrapper[4565]: I1125 09:21:50.792657 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 09:21:50 crc kubenswrapper[4565]: I1125 09:21:50.792882 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5357da13-b37d-4661-8e17-45ddaf365687" containerName="nova-metadata-log" containerID="cri-o://a2b8225dd086b6c265877c517e76e573f7cc8334933bab963bcdfb3feada45f0" gracePeriod=30 Nov 
25 09:21:50 crc kubenswrapper[4565]: I1125 09:21:50.793191 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5357da13-b37d-4661-8e17-45ddaf365687" containerName="nova-metadata-metadata" containerID="cri-o://e11dc25364c6a8f168fa41a6bd09ceb0034be345452328d179073dd20099ed97" gracePeriod=30 Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.299775 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.482358 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-combined-ca-bundle\") pod \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\" (UID: \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\") " Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.482452 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-logs\") pod \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\" (UID: \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\") " Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.482603 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-internal-tls-certs\") pod \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\" (UID: \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\") " Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.482665 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-public-tls-certs\") pod \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\" (UID: \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\") " Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.482761 4565 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwcd6\" (UniqueName: \"kubernetes.io/projected/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-kube-api-access-jwcd6\") pod \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\" (UID: \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\") " Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.482856 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-config-data\") pod \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\" (UID: \"fc7d25a8-fd84-48aa-95d5-4299451f2fc7\") " Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.485238 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-logs" (OuterVolumeSpecName: "logs") pod "fc7d25a8-fd84-48aa-95d5-4299451f2fc7" (UID: "fc7d25a8-fd84-48aa-95d5-4299451f2fc7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.491148 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-kube-api-access-jwcd6" (OuterVolumeSpecName: "kube-api-access-jwcd6") pod "fc7d25a8-fd84-48aa-95d5-4299451f2fc7" (UID: "fc7d25a8-fd84-48aa-95d5-4299451f2fc7"). InnerVolumeSpecName "kube-api-access-jwcd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.513056 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc7d25a8-fd84-48aa-95d5-4299451f2fc7" (UID: "fc7d25a8-fd84-48aa-95d5-4299451f2fc7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.519603 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-config-data" (OuterVolumeSpecName: "config-data") pod "fc7d25a8-fd84-48aa-95d5-4299451f2fc7" (UID: "fc7d25a8-fd84-48aa-95d5-4299451f2fc7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.532682 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fc7d25a8-fd84-48aa-95d5-4299451f2fc7" (UID: "fc7d25a8-fd84-48aa-95d5-4299451f2fc7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.543128 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fc7d25a8-fd84-48aa-95d5-4299451f2fc7" (UID: "fc7d25a8-fd84-48aa-95d5-4299451f2fc7"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.575268 4565 generic.go:334] "Generic (PLEG): container finished" podID="5357da13-b37d-4661-8e17-45ddaf365687" containerID="a2b8225dd086b6c265877c517e76e573f7cc8334933bab963bcdfb3feada45f0" exitCode=143 Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.575358 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5357da13-b37d-4661-8e17-45ddaf365687","Type":"ContainerDied","Data":"a2b8225dd086b6c265877c517e76e573f7cc8334933bab963bcdfb3feada45f0"} Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.577700 4565 generic.go:334] "Generic (PLEG): container finished" podID="fc7d25a8-fd84-48aa-95d5-4299451f2fc7" containerID="40cd4de375d2c64b3c61f49308748c991d090f590b242577dc1eefb9ea41dbfe" exitCode=0 Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.577737 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc7d25a8-fd84-48aa-95d5-4299451f2fc7","Type":"ContainerDied","Data":"40cd4de375d2c64b3c61f49308748c991d090f590b242577dc1eefb9ea41dbfe"} Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.577787 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.577797 4565 scope.go:117] "RemoveContainer" containerID="40cd4de375d2c64b3c61f49308748c991d090f590b242577dc1eefb9ea41dbfe" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.577975 4565 generic.go:334] "Generic (PLEG): container finished" podID="fc7d25a8-fd84-48aa-95d5-4299451f2fc7" containerID="a129c18aec59ac257f15545c77b78edd8e3995c607c9d3a5f4a11fe641c814c5" exitCode=143 Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.578003 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc7d25a8-fd84-48aa-95d5-4299451f2fc7","Type":"ContainerDied","Data":"a129c18aec59ac257f15545c77b78edd8e3995c607c9d3a5f4a11fe641c814c5"} Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.579224 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc7d25a8-fd84-48aa-95d5-4299451f2fc7","Type":"ContainerDied","Data":"71af7fb071e9fcaaee174b15a4c7ddae16a00caf0ee69dbaa4ce9b2b036c0bb5"} Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.585655 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.585683 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.585697 4565 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-logs\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.585706 4565 reconciler_common.go:293] "Volume detached for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.585715 4565 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.585723 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwcd6\" (UniqueName: \"kubernetes.io/projected/fc7d25a8-fd84-48aa-95d5-4299451f2fc7-kube-api-access-jwcd6\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.603364 4565 scope.go:117] "RemoveContainer" containerID="a129c18aec59ac257f15545c77b78edd8e3995c607c9d3a5f4a11fe641c814c5" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.611320 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.616688 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.622308 4565 scope.go:117] "RemoveContainer" containerID="40cd4de375d2c64b3c61f49308748c991d090f590b242577dc1eefb9ea41dbfe" Nov 25 09:21:51 crc kubenswrapper[4565]: E1125 09:21:51.625401 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40cd4de375d2c64b3c61f49308748c991d090f590b242577dc1eefb9ea41dbfe\": container with ID starting with 40cd4de375d2c64b3c61f49308748c991d090f590b242577dc1eefb9ea41dbfe not found: ID does not exist" containerID="40cd4de375d2c64b3c61f49308748c991d090f590b242577dc1eefb9ea41dbfe" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.625569 4565 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"40cd4de375d2c64b3c61f49308748c991d090f590b242577dc1eefb9ea41dbfe"} err="failed to get container status \"40cd4de375d2c64b3c61f49308748c991d090f590b242577dc1eefb9ea41dbfe\": rpc error: code = NotFound desc = could not find container \"40cd4de375d2c64b3c61f49308748c991d090f590b242577dc1eefb9ea41dbfe\": container with ID starting with 40cd4de375d2c64b3c61f49308748c991d090f590b242577dc1eefb9ea41dbfe not found: ID does not exist" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.625669 4565 scope.go:117] "RemoveContainer" containerID="a129c18aec59ac257f15545c77b78edd8e3995c607c9d3a5f4a11fe641c814c5" Nov 25 09:21:51 crc kubenswrapper[4565]: E1125 09:21:51.626059 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a129c18aec59ac257f15545c77b78edd8e3995c607c9d3a5f4a11fe641c814c5\": container with ID starting with a129c18aec59ac257f15545c77b78edd8e3995c607c9d3a5f4a11fe641c814c5 not found: ID does not exist" containerID="a129c18aec59ac257f15545c77b78edd8e3995c607c9d3a5f4a11fe641c814c5" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.626139 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a129c18aec59ac257f15545c77b78edd8e3995c607c9d3a5f4a11fe641c814c5"} err="failed to get container status \"a129c18aec59ac257f15545c77b78edd8e3995c607c9d3a5f4a11fe641c814c5\": rpc error: code = NotFound desc = could not find container \"a129c18aec59ac257f15545c77b78edd8e3995c607c9d3a5f4a11fe641c814c5\": container with ID starting with a129c18aec59ac257f15545c77b78edd8e3995c607c9d3a5f4a11fe641c814c5 not found: ID does not exist" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.626207 4565 scope.go:117] "RemoveContainer" containerID="40cd4de375d2c64b3c61f49308748c991d090f590b242577dc1eefb9ea41dbfe" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.626614 4565 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"40cd4de375d2c64b3c61f49308748c991d090f590b242577dc1eefb9ea41dbfe"} err="failed to get container status \"40cd4de375d2c64b3c61f49308748c991d090f590b242577dc1eefb9ea41dbfe\": rpc error: code = NotFound desc = could not find container \"40cd4de375d2c64b3c61f49308748c991d090f590b242577dc1eefb9ea41dbfe\": container with ID starting with 40cd4de375d2c64b3c61f49308748c991d090f590b242577dc1eefb9ea41dbfe not found: ID does not exist" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.626653 4565 scope.go:117] "RemoveContainer" containerID="a129c18aec59ac257f15545c77b78edd8e3995c607c9d3a5f4a11fe641c814c5" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.626992 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a129c18aec59ac257f15545c77b78edd8e3995c607c9d3a5f4a11fe641c814c5"} err="failed to get container status \"a129c18aec59ac257f15545c77b78edd8e3995c607c9d3a5f4a11fe641c814c5\": rpc error: code = NotFound desc = could not find container \"a129c18aec59ac257f15545c77b78edd8e3995c607c9d3a5f4a11fe641c814c5\": container with ID starting with a129c18aec59ac257f15545c77b78edd8e3995c607c9d3a5f4a11fe641c814c5 not found: ID does not exist" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.638047 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 25 09:21:51 crc kubenswrapper[4565]: E1125 09:21:51.638586 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46e9931e-fcd4-4c18-8c06-537d4c162c1f" containerName="dnsmasq-dns" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.638670 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="46e9931e-fcd4-4c18-8c06-537d4c162c1f" containerName="dnsmasq-dns" Nov 25 09:21:51 crc kubenswrapper[4565]: E1125 09:21:51.638762 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc7d25a8-fd84-48aa-95d5-4299451f2fc7" containerName="nova-api-log" Nov 25 
09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.638831 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc7d25a8-fd84-48aa-95d5-4299451f2fc7" containerName="nova-api-log" Nov 25 09:21:51 crc kubenswrapper[4565]: E1125 09:21:51.638911 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc7d25a8-fd84-48aa-95d5-4299451f2fc7" containerName="nova-api-api" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.638992 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc7d25a8-fd84-48aa-95d5-4299451f2fc7" containerName="nova-api-api" Nov 25 09:21:51 crc kubenswrapper[4565]: E1125 09:21:51.639078 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c53b2bc5-cd7f-4a87-878c-ca9deec24f8b" containerName="nova-manage" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.639126 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c53b2bc5-cd7f-4a87-878c-ca9deec24f8b" containerName="nova-manage" Nov 25 09:21:51 crc kubenswrapper[4565]: E1125 09:21:51.639229 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46e9931e-fcd4-4c18-8c06-537d4c162c1f" containerName="init" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.639278 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="46e9931e-fcd4-4c18-8c06-537d4c162c1f" containerName="init" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.639602 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc7d25a8-fd84-48aa-95d5-4299451f2fc7" containerName="nova-api-api" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.639678 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc7d25a8-fd84-48aa-95d5-4299451f2fc7" containerName="nova-api-log" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.639744 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="c53b2bc5-cd7f-4a87-878c-ca9deec24f8b" containerName="nova-manage" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.639818 
4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="46e9931e-fcd4-4c18-8c06-537d4c162c1f" containerName="dnsmasq-dns" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.640784 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.648239 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.648279 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.648419 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.659582 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.789270 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13ff22ce-a715-4dba-aaa3-b6ba3a929d55-public-tls-certs\") pod \"nova-api-0\" (UID: \"13ff22ce-a715-4dba-aaa3-b6ba3a929d55\") " pod="openstack/nova-api-0" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.790108 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcbhq\" (UniqueName: \"kubernetes.io/projected/13ff22ce-a715-4dba-aaa3-b6ba3a929d55-kube-api-access-zcbhq\") pod \"nova-api-0\" (UID: \"13ff22ce-a715-4dba-aaa3-b6ba3a929d55\") " pod="openstack/nova-api-0" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.790240 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13ff22ce-a715-4dba-aaa3-b6ba3a929d55-logs\") pod \"nova-api-0\" (UID: 
\"13ff22ce-a715-4dba-aaa3-b6ba3a929d55\") " pod="openstack/nova-api-0" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.790369 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13ff22ce-a715-4dba-aaa3-b6ba3a929d55-config-data\") pod \"nova-api-0\" (UID: \"13ff22ce-a715-4dba-aaa3-b6ba3a929d55\") " pod="openstack/nova-api-0" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.790467 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13ff22ce-a715-4dba-aaa3-b6ba3a929d55-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"13ff22ce-a715-4dba-aaa3-b6ba3a929d55\") " pod="openstack/nova-api-0" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.790545 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13ff22ce-a715-4dba-aaa3-b6ba3a929d55-internal-tls-certs\") pod \"nova-api-0\" (UID: \"13ff22ce-a715-4dba-aaa3-b6ba3a929d55\") " pod="openstack/nova-api-0" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.892576 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13ff22ce-a715-4dba-aaa3-b6ba3a929d55-public-tls-certs\") pod \"nova-api-0\" (UID: \"13ff22ce-a715-4dba-aaa3-b6ba3a929d55\") " pod="openstack/nova-api-0" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.892671 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcbhq\" (UniqueName: \"kubernetes.io/projected/13ff22ce-a715-4dba-aaa3-b6ba3a929d55-kube-api-access-zcbhq\") pod \"nova-api-0\" (UID: \"13ff22ce-a715-4dba-aaa3-b6ba3a929d55\") " pod="openstack/nova-api-0" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.892708 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13ff22ce-a715-4dba-aaa3-b6ba3a929d55-logs\") pod \"nova-api-0\" (UID: \"13ff22ce-a715-4dba-aaa3-b6ba3a929d55\") " pod="openstack/nova-api-0" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.892812 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13ff22ce-a715-4dba-aaa3-b6ba3a929d55-config-data\") pod \"nova-api-0\" (UID: \"13ff22ce-a715-4dba-aaa3-b6ba3a929d55\") " pod="openstack/nova-api-0" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.892872 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13ff22ce-a715-4dba-aaa3-b6ba3a929d55-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"13ff22ce-a715-4dba-aaa3-b6ba3a929d55\") " pod="openstack/nova-api-0" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.892899 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13ff22ce-a715-4dba-aaa3-b6ba3a929d55-internal-tls-certs\") pod \"nova-api-0\" (UID: \"13ff22ce-a715-4dba-aaa3-b6ba3a929d55\") " pod="openstack/nova-api-0" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.893310 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13ff22ce-a715-4dba-aaa3-b6ba3a929d55-logs\") pod \"nova-api-0\" (UID: \"13ff22ce-a715-4dba-aaa3-b6ba3a929d55\") " pod="openstack/nova-api-0" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.898058 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13ff22ce-a715-4dba-aaa3-b6ba3a929d55-config-data\") pod \"nova-api-0\" (UID: \"13ff22ce-a715-4dba-aaa3-b6ba3a929d55\") " pod="openstack/nova-api-0" Nov 25 
09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.898345 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13ff22ce-a715-4dba-aaa3-b6ba3a929d55-internal-tls-certs\") pod \"nova-api-0\" (UID: \"13ff22ce-a715-4dba-aaa3-b6ba3a929d55\") " pod="openstack/nova-api-0" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.898704 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13ff22ce-a715-4dba-aaa3-b6ba3a929d55-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"13ff22ce-a715-4dba-aaa3-b6ba3a929d55\") " pod="openstack/nova-api-0" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.899167 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13ff22ce-a715-4dba-aaa3-b6ba3a929d55-public-tls-certs\") pod \"nova-api-0\" (UID: \"13ff22ce-a715-4dba-aaa3-b6ba3a929d55\") " pod="openstack/nova-api-0" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.910017 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcbhq\" (UniqueName: \"kubernetes.io/projected/13ff22ce-a715-4dba-aaa3-b6ba3a929d55-kube-api-access-zcbhq\") pod \"nova-api-0\" (UID: \"13ff22ce-a715-4dba-aaa3-b6ba3a929d55\") " pod="openstack/nova-api-0" Nov 25 09:21:51 crc kubenswrapper[4565]: I1125 09:21:51.956022 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.321099 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.409657 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/111effe2-6ce0-48eb-a9bc-afbf3a9e41cd-combined-ca-bundle\") pod \"111effe2-6ce0-48eb-a9bc-afbf3a9e41cd\" (UID: \"111effe2-6ce0-48eb-a9bc-afbf3a9e41cd\") " Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.410082 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5z2s\" (UniqueName: \"kubernetes.io/projected/111effe2-6ce0-48eb-a9bc-afbf3a9e41cd-kube-api-access-v5z2s\") pod \"111effe2-6ce0-48eb-a9bc-afbf3a9e41cd\" (UID: \"111effe2-6ce0-48eb-a9bc-afbf3a9e41cd\") " Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.410167 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/111effe2-6ce0-48eb-a9bc-afbf3a9e41cd-config-data\") pod \"111effe2-6ce0-48eb-a9bc-afbf3a9e41cd\" (UID: \"111effe2-6ce0-48eb-a9bc-afbf3a9e41cd\") " Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.414737 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/111effe2-6ce0-48eb-a9bc-afbf3a9e41cd-kube-api-access-v5z2s" (OuterVolumeSpecName: "kube-api-access-v5z2s") pod "111effe2-6ce0-48eb-a9bc-afbf3a9e41cd" (UID: "111effe2-6ce0-48eb-a9bc-afbf3a9e41cd"). InnerVolumeSpecName "kube-api-access-v5z2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.434691 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/111effe2-6ce0-48eb-a9bc-afbf3a9e41cd-config-data" (OuterVolumeSpecName: "config-data") pod "111effe2-6ce0-48eb-a9bc-afbf3a9e41cd" (UID: "111effe2-6ce0-48eb-a9bc-afbf3a9e41cd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.440090 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/111effe2-6ce0-48eb-a9bc-afbf3a9e41cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "111effe2-6ce0-48eb-a9bc-afbf3a9e41cd" (UID: "111effe2-6ce0-48eb-a9bc-afbf3a9e41cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.514476 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5z2s\" (UniqueName: \"kubernetes.io/projected/111effe2-6ce0-48eb-a9bc-afbf3a9e41cd-kube-api-access-v5z2s\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.514528 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/111effe2-6ce0-48eb-a9bc-afbf3a9e41cd-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.514541 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/111effe2-6ce0-48eb-a9bc-afbf3a9e41cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.521063 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 09:21:52 crc kubenswrapper[4565]: W1125 09:21:52.526046 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13ff22ce_a715_4dba_aaa3_b6ba3a929d55.slice/crio-3597bdebb6afe288b8c77032429df7fa0c3dad7d5619bc20893da8921f23656d WatchSource:0}: Error finding container 3597bdebb6afe288b8c77032429df7fa0c3dad7d5619bc20893da8921f23656d: Status 404 returned error can't find the container with id 
3597bdebb6afe288b8c77032429df7fa0c3dad7d5619bc20893da8921f23656d Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.587132 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"13ff22ce-a715-4dba-aaa3-b6ba3a929d55","Type":"ContainerStarted","Data":"3597bdebb6afe288b8c77032429df7fa0c3dad7d5619bc20893da8921f23656d"} Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.589242 4565 generic.go:334] "Generic (PLEG): container finished" podID="111effe2-6ce0-48eb-a9bc-afbf3a9e41cd" containerID="3c6c044b308277119e3e6b9055b7e46a8db7655619bf54699061c1cc02e54646" exitCode=0 Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.589316 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"111effe2-6ce0-48eb-a9bc-afbf3a9e41cd","Type":"ContainerDied","Data":"3c6c044b308277119e3e6b9055b7e46a8db7655619bf54699061c1cc02e54646"} Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.589351 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.589371 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"111effe2-6ce0-48eb-a9bc-afbf3a9e41cd","Type":"ContainerDied","Data":"d0980a2bbfef7f28c204ac1a31675d9aebdb1f5cd95791865713925b40c7bb05"} Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.589395 4565 scope.go:117] "RemoveContainer" containerID="3c6c044b308277119e3e6b9055b7e46a8db7655619bf54699061c1cc02e54646" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.629893 4565 scope.go:117] "RemoveContainer" containerID="3c6c044b308277119e3e6b9055b7e46a8db7655619bf54699061c1cc02e54646" Nov 25 09:21:52 crc kubenswrapper[4565]: E1125 09:21:52.630279 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c6c044b308277119e3e6b9055b7e46a8db7655619bf54699061c1cc02e54646\": container with ID starting with 3c6c044b308277119e3e6b9055b7e46a8db7655619bf54699061c1cc02e54646 not found: ID does not exist" containerID="3c6c044b308277119e3e6b9055b7e46a8db7655619bf54699061c1cc02e54646" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.630407 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c6c044b308277119e3e6b9055b7e46a8db7655619bf54699061c1cc02e54646"} err="failed to get container status \"3c6c044b308277119e3e6b9055b7e46a8db7655619bf54699061c1cc02e54646\": rpc error: code = NotFound desc = could not find container \"3c6c044b308277119e3e6b9055b7e46a8db7655619bf54699061c1cc02e54646\": container with ID starting with 3c6c044b308277119e3e6b9055b7e46a8db7655619bf54699061c1cc02e54646 not found: ID does not exist" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.633360 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.642574 4565 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.660620 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 09:21:52 crc kubenswrapper[4565]: E1125 09:21:52.661377 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="111effe2-6ce0-48eb-a9bc-afbf3a9e41cd" containerName="nova-scheduler-scheduler" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.661398 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="111effe2-6ce0-48eb-a9bc-afbf3a9e41cd" containerName="nova-scheduler-scheduler" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.661636 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="111effe2-6ce0-48eb-a9bc-afbf3a9e41cd" containerName="nova-scheduler-scheduler" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.662356 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.664542 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.670484 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.821667 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m2zf\" (UniqueName: \"kubernetes.io/projected/c528e51d-34ba-402c-8985-b53beb776f43-kube-api-access-2m2zf\") pod \"nova-scheduler-0\" (UID: \"c528e51d-34ba-402c-8985-b53beb776f43\") " pod="openstack/nova-scheduler-0" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.821845 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c528e51d-34ba-402c-8985-b53beb776f43-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c528e51d-34ba-402c-8985-b53beb776f43\") " pod="openstack/nova-scheduler-0" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.822171 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c528e51d-34ba-402c-8985-b53beb776f43-config-data\") pod \"nova-scheduler-0\" (UID: \"c528e51d-34ba-402c-8985-b53beb776f43\") " pod="openstack/nova-scheduler-0" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.924301 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c528e51d-34ba-402c-8985-b53beb776f43-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c528e51d-34ba-402c-8985-b53beb776f43\") " pod="openstack/nova-scheduler-0" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.924424 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c528e51d-34ba-402c-8985-b53beb776f43-config-data\") pod \"nova-scheduler-0\" (UID: \"c528e51d-34ba-402c-8985-b53beb776f43\") " pod="openstack/nova-scheduler-0" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.924520 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m2zf\" (UniqueName: \"kubernetes.io/projected/c528e51d-34ba-402c-8985-b53beb776f43-kube-api-access-2m2zf\") pod \"nova-scheduler-0\" (UID: \"c528e51d-34ba-402c-8985-b53beb776f43\") " pod="openstack/nova-scheduler-0" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.929263 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c528e51d-34ba-402c-8985-b53beb776f43-config-data\") pod \"nova-scheduler-0\" (UID: \"c528e51d-34ba-402c-8985-b53beb776f43\") " 
pod="openstack/nova-scheduler-0" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.932713 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c528e51d-34ba-402c-8985-b53beb776f43-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c528e51d-34ba-402c-8985-b53beb776f43\") " pod="openstack/nova-scheduler-0" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.938590 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m2zf\" (UniqueName: \"kubernetes.io/projected/c528e51d-34ba-402c-8985-b53beb776f43-kube-api-access-2m2zf\") pod \"nova-scheduler-0\" (UID: \"c528e51d-34ba-402c-8985-b53beb776f43\") " pod="openstack/nova-scheduler-0" Nov 25 09:21:52 crc kubenswrapper[4565]: I1125 09:21:52.978266 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 09:21:53 crc kubenswrapper[4565]: I1125 09:21:53.112133 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="111effe2-6ce0-48eb-a9bc-afbf3a9e41cd" path="/var/lib/kubelet/pods/111effe2-6ce0-48eb-a9bc-afbf3a9e41cd/volumes" Nov 25 09:21:53 crc kubenswrapper[4565]: I1125 09:21:53.113049 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc7d25a8-fd84-48aa-95d5-4299451f2fc7" path="/var/lib/kubelet/pods/fc7d25a8-fd84-48aa-95d5-4299451f2fc7/volumes" Nov 25 09:21:53 crc kubenswrapper[4565]: I1125 09:21:53.415125 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 09:21:53 crc kubenswrapper[4565]: W1125 09:21:53.416894 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc528e51d_34ba_402c_8985_b53beb776f43.slice/crio-3b2a8109ac7f0a9b312f44962f1739b7e5ed6818d1ff4684245cb43c7066f307 WatchSource:0}: Error finding container 
3b2a8109ac7f0a9b312f44962f1739b7e5ed6818d1ff4684245cb43c7066f307: Status 404 returned error can't find the container with id 3b2a8109ac7f0a9b312f44962f1739b7e5ed6818d1ff4684245cb43c7066f307 Nov 25 09:21:53 crc kubenswrapper[4565]: I1125 09:21:53.615026 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c528e51d-34ba-402c-8985-b53beb776f43","Type":"ContainerStarted","Data":"ba61df8aaac52a7b14011e77a1b978ac9f4acf2a7f8c1c668816791bbdc8b33d"} Nov 25 09:21:53 crc kubenswrapper[4565]: I1125 09:21:53.615341 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c528e51d-34ba-402c-8985-b53beb776f43","Type":"ContainerStarted","Data":"3b2a8109ac7f0a9b312f44962f1739b7e5ed6818d1ff4684245cb43c7066f307"} Nov 25 09:21:53 crc kubenswrapper[4565]: I1125 09:21:53.619245 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"13ff22ce-a715-4dba-aaa3-b6ba3a929d55","Type":"ContainerStarted","Data":"7cb606850f9e7273531c2d1b923a782425dc187d541cda024c048acd18fc4a54"} Nov 25 09:21:53 crc kubenswrapper[4565]: I1125 09:21:53.619297 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"13ff22ce-a715-4dba-aaa3-b6ba3a929d55","Type":"ContainerStarted","Data":"b089d5899638d4016c3fd9493d208a516d19ae0e4ee4a1b6c888b31940fa0e52"} Nov 25 09:21:53 crc kubenswrapper[4565]: I1125 09:21:53.643037 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.64301523 podStartE2EDuration="1.64301523s" podCreationTimestamp="2025-11-25 09:21:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:21:53.636036314 +0000 UTC m=+1046.838531453" watchObservedRunningTime="2025-11-25 09:21:53.64301523 +0000 UTC m=+1046.845510369" Nov 25 09:21:53 crc kubenswrapper[4565]: I1125 
09:21:53.659687 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.659662058 podStartE2EDuration="2.659662058s" podCreationTimestamp="2025-11-25 09:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:21:53.65446885 +0000 UTC m=+1046.856963988" watchObservedRunningTime="2025-11-25 09:21:53.659662058 +0000 UTC m=+1046.862157187" Nov 25 09:21:53 crc kubenswrapper[4565]: I1125 09:21:53.960960 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="5357da13-b37d-4661-8e17-45ddaf365687" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.177:8775/\": read tcp 10.217.0.2:55756->10.217.0.177:8775: read: connection reset by peer" Nov 25 09:21:53 crc kubenswrapper[4565]: I1125 09:21:53.960985 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="5357da13-b37d-4661-8e17-45ddaf365687" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.177:8775/\": read tcp 10.217.0.2:55748->10.217.0.177:8775: read: connection reset by peer" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.301164 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.460716 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5357da13-b37d-4661-8e17-45ddaf365687-combined-ca-bundle\") pod \"5357da13-b37d-4661-8e17-45ddaf365687\" (UID: \"5357da13-b37d-4661-8e17-45ddaf365687\") " Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.460983 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn2s5\" (UniqueName: \"kubernetes.io/projected/5357da13-b37d-4661-8e17-45ddaf365687-kube-api-access-gn2s5\") pod \"5357da13-b37d-4661-8e17-45ddaf365687\" (UID: \"5357da13-b37d-4661-8e17-45ddaf365687\") " Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.461877 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5357da13-b37d-4661-8e17-45ddaf365687-nova-metadata-tls-certs\") pod \"5357da13-b37d-4661-8e17-45ddaf365687\" (UID: \"5357da13-b37d-4661-8e17-45ddaf365687\") " Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.461919 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5357da13-b37d-4661-8e17-45ddaf365687-config-data\") pod \"5357da13-b37d-4661-8e17-45ddaf365687\" (UID: \"5357da13-b37d-4661-8e17-45ddaf365687\") " Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.462002 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5357da13-b37d-4661-8e17-45ddaf365687-logs\") pod \"5357da13-b37d-4661-8e17-45ddaf365687\" (UID: \"5357da13-b37d-4661-8e17-45ddaf365687\") " Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.463136 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5357da13-b37d-4661-8e17-45ddaf365687-logs" (OuterVolumeSpecName: "logs") pod "5357da13-b37d-4661-8e17-45ddaf365687" (UID: "5357da13-b37d-4661-8e17-45ddaf365687"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.466856 4565 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5357da13-b37d-4661-8e17-45ddaf365687-logs\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.483853 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5357da13-b37d-4661-8e17-45ddaf365687-kube-api-access-gn2s5" (OuterVolumeSpecName: "kube-api-access-gn2s5") pod "5357da13-b37d-4661-8e17-45ddaf365687" (UID: "5357da13-b37d-4661-8e17-45ddaf365687"). InnerVolumeSpecName "kube-api-access-gn2s5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.488569 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5357da13-b37d-4661-8e17-45ddaf365687-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5357da13-b37d-4661-8e17-45ddaf365687" (UID: "5357da13-b37d-4661-8e17-45ddaf365687"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.491527 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5357da13-b37d-4661-8e17-45ddaf365687-config-data" (OuterVolumeSpecName: "config-data") pod "5357da13-b37d-4661-8e17-45ddaf365687" (UID: "5357da13-b37d-4661-8e17-45ddaf365687"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.523035 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5357da13-b37d-4661-8e17-45ddaf365687-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "5357da13-b37d-4661-8e17-45ddaf365687" (UID: "5357da13-b37d-4661-8e17-45ddaf365687"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.568749 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5357da13-b37d-4661-8e17-45ddaf365687-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.568891 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn2s5\" (UniqueName: \"kubernetes.io/projected/5357da13-b37d-4661-8e17-45ddaf365687-kube-api-access-gn2s5\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.568967 4565 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5357da13-b37d-4661-8e17-45ddaf365687-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.569032 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5357da13-b37d-4661-8e17-45ddaf365687-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.632048 4565 generic.go:334] "Generic (PLEG): container finished" podID="5357da13-b37d-4661-8e17-45ddaf365687" containerID="e11dc25364c6a8f168fa41a6bd09ceb0034be345452328d179073dd20099ed97" exitCode=0 Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.632308 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.632234 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5357da13-b37d-4661-8e17-45ddaf365687","Type":"ContainerDied","Data":"e11dc25364c6a8f168fa41a6bd09ceb0034be345452328d179073dd20099ed97"} Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.632490 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5357da13-b37d-4661-8e17-45ddaf365687","Type":"ContainerDied","Data":"c491dbdaa9a02316faa7220ca4f090009851479f1722b0f62f607733ce162cb7"} Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.633978 4565 scope.go:117] "RemoveContainer" containerID="e11dc25364c6a8f168fa41a6bd09ceb0034be345452328d179073dd20099ed97" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.678556 4565 scope.go:117] "RemoveContainer" containerID="a2b8225dd086b6c265877c517e76e573f7cc8334933bab963bcdfb3feada45f0" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.686374 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.694607 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.701151 4565 scope.go:117] "RemoveContainer" containerID="e11dc25364c6a8f168fa41a6bd09ceb0034be345452328d179073dd20099ed97" Nov 25 09:21:54 crc kubenswrapper[4565]: E1125 09:21:54.701460 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e11dc25364c6a8f168fa41a6bd09ceb0034be345452328d179073dd20099ed97\": container with ID starting with e11dc25364c6a8f168fa41a6bd09ceb0034be345452328d179073dd20099ed97 not found: ID does not exist" containerID="e11dc25364c6a8f168fa41a6bd09ceb0034be345452328d179073dd20099ed97" Nov 25 09:21:54 crc kubenswrapper[4565]: 
I1125 09:21:54.701488 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e11dc25364c6a8f168fa41a6bd09ceb0034be345452328d179073dd20099ed97"} err="failed to get container status \"e11dc25364c6a8f168fa41a6bd09ceb0034be345452328d179073dd20099ed97\": rpc error: code = NotFound desc = could not find container \"e11dc25364c6a8f168fa41a6bd09ceb0034be345452328d179073dd20099ed97\": container with ID starting with e11dc25364c6a8f168fa41a6bd09ceb0034be345452328d179073dd20099ed97 not found: ID does not exist" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.701507 4565 scope.go:117] "RemoveContainer" containerID="a2b8225dd086b6c265877c517e76e573f7cc8334933bab963bcdfb3feada45f0" Nov 25 09:21:54 crc kubenswrapper[4565]: E1125 09:21:54.701702 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2b8225dd086b6c265877c517e76e573f7cc8334933bab963bcdfb3feada45f0\": container with ID starting with a2b8225dd086b6c265877c517e76e573f7cc8334933bab963bcdfb3feada45f0 not found: ID does not exist" containerID="a2b8225dd086b6c265877c517e76e573f7cc8334933bab963bcdfb3feada45f0" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.701721 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2b8225dd086b6c265877c517e76e573f7cc8334933bab963bcdfb3feada45f0"} err="failed to get container status \"a2b8225dd086b6c265877c517e76e573f7cc8334933bab963bcdfb3feada45f0\": rpc error: code = NotFound desc = could not find container \"a2b8225dd086b6c265877c517e76e573f7cc8334933bab963bcdfb3feada45f0\": container with ID starting with a2b8225dd086b6c265877c517e76e573f7cc8334933bab963bcdfb3feada45f0 not found: ID does not exist" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.717080 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 25 09:21:54 crc kubenswrapper[4565]: E1125 09:21:54.717533 
4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5357da13-b37d-4661-8e17-45ddaf365687" containerName="nova-metadata-metadata" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.717556 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="5357da13-b37d-4661-8e17-45ddaf365687" containerName="nova-metadata-metadata" Nov 25 09:21:54 crc kubenswrapper[4565]: E1125 09:21:54.717606 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5357da13-b37d-4661-8e17-45ddaf365687" containerName="nova-metadata-log" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.717613 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="5357da13-b37d-4661-8e17-45ddaf365687" containerName="nova-metadata-log" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.717789 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="5357da13-b37d-4661-8e17-45ddaf365687" containerName="nova-metadata-metadata" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.717837 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="5357da13-b37d-4661-8e17-45ddaf365687" containerName="nova-metadata-log" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.718878 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.721022 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.721271 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.725912 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.885424 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e090c67-47d1-445e-843f-4cf950699016-logs\") pod \"nova-metadata-0\" (UID: \"1e090c67-47d1-445e-843f-4cf950699016\") " pod="openstack/nova-metadata-0" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.885671 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e090c67-47d1-445e-843f-4cf950699016-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1e090c67-47d1-445e-843f-4cf950699016\") " pod="openstack/nova-metadata-0" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.885967 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e090c67-47d1-445e-843f-4cf950699016-config-data\") pod \"nova-metadata-0\" (UID: \"1e090c67-47d1-445e-843f-4cf950699016\") " pod="openstack/nova-metadata-0" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.886089 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw544\" (UniqueName: \"kubernetes.io/projected/1e090c67-47d1-445e-843f-4cf950699016-kube-api-access-dw544\") pod \"nova-metadata-0\" (UID: 
\"1e090c67-47d1-445e-843f-4cf950699016\") " pod="openstack/nova-metadata-0" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.886237 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e090c67-47d1-445e-843f-4cf950699016-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1e090c67-47d1-445e-843f-4cf950699016\") " pod="openstack/nova-metadata-0" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.988205 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e090c67-47d1-445e-843f-4cf950699016-logs\") pod \"nova-metadata-0\" (UID: \"1e090c67-47d1-445e-843f-4cf950699016\") " pod="openstack/nova-metadata-0" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.988346 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e090c67-47d1-445e-843f-4cf950699016-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1e090c67-47d1-445e-843f-4cf950699016\") " pod="openstack/nova-metadata-0" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.988734 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e090c67-47d1-445e-843f-4cf950699016-logs\") pod \"nova-metadata-0\" (UID: \"1e090c67-47d1-445e-843f-4cf950699016\") " pod="openstack/nova-metadata-0" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.989223 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e090c67-47d1-445e-843f-4cf950699016-config-data\") pod \"nova-metadata-0\" (UID: \"1e090c67-47d1-445e-843f-4cf950699016\") " pod="openstack/nova-metadata-0" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.989420 4565 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-dw544\" (UniqueName: \"kubernetes.io/projected/1e090c67-47d1-445e-843f-4cf950699016-kube-api-access-dw544\") pod \"nova-metadata-0\" (UID: \"1e090c67-47d1-445e-843f-4cf950699016\") " pod="openstack/nova-metadata-0" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.989589 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e090c67-47d1-445e-843f-4cf950699016-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1e090c67-47d1-445e-843f-4cf950699016\") " pod="openstack/nova-metadata-0" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.993133 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e090c67-47d1-445e-843f-4cf950699016-config-data\") pod \"nova-metadata-0\" (UID: \"1e090c67-47d1-445e-843f-4cf950699016\") " pod="openstack/nova-metadata-0" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.993264 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e090c67-47d1-445e-843f-4cf950699016-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1e090c67-47d1-445e-843f-4cf950699016\") " pod="openstack/nova-metadata-0" Nov 25 09:21:54 crc kubenswrapper[4565]: I1125 09:21:54.996346 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e090c67-47d1-445e-843f-4cf950699016-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1e090c67-47d1-445e-843f-4cf950699016\") " pod="openstack/nova-metadata-0" Nov 25 09:21:55 crc kubenswrapper[4565]: I1125 09:21:55.006083 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw544\" (UniqueName: \"kubernetes.io/projected/1e090c67-47d1-445e-843f-4cf950699016-kube-api-access-dw544\") pod \"nova-metadata-0\" (UID: 
\"1e090c67-47d1-445e-843f-4cf950699016\") " pod="openstack/nova-metadata-0" Nov 25 09:21:55 crc kubenswrapper[4565]: I1125 09:21:55.033973 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 09:21:55 crc kubenswrapper[4565]: I1125 09:21:55.109577 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5357da13-b37d-4661-8e17-45ddaf365687" path="/var/lib/kubelet/pods/5357da13-b37d-4661-8e17-45ddaf365687/volumes" Nov 25 09:21:55 crc kubenswrapper[4565]: I1125 09:21:55.454620 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 09:21:55 crc kubenswrapper[4565]: W1125 09:21:55.462825 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e090c67_47d1_445e_843f_4cf950699016.slice/crio-078d96ecc9e1a0c58971b57daf05d47408b2a2514e88451305b259a9274fea88 WatchSource:0}: Error finding container 078d96ecc9e1a0c58971b57daf05d47408b2a2514e88451305b259a9274fea88: Status 404 returned error can't find the container with id 078d96ecc9e1a0c58971b57daf05d47408b2a2514e88451305b259a9274fea88 Nov 25 09:21:55 crc kubenswrapper[4565]: I1125 09:21:55.649378 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e090c67-47d1-445e-843f-4cf950699016","Type":"ContainerStarted","Data":"5717226d6303fa5214b40ad00f894d7591bd73edff82bd133c4088fbb753c090"} Nov 25 09:21:55 crc kubenswrapper[4565]: I1125 09:21:55.649656 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e090c67-47d1-445e-843f-4cf950699016","Type":"ContainerStarted","Data":"078d96ecc9e1a0c58971b57daf05d47408b2a2514e88451305b259a9274fea88"} Nov 25 09:21:56 crc kubenswrapper[4565]: I1125 09:21:56.660749 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"1e090c67-47d1-445e-843f-4cf950699016","Type":"ContainerStarted","Data":"e2e21f7a9f4b809e0ab831aa97de97f010cf6a0df9a62438dd108bef6268ffea"} Nov 25 09:21:56 crc kubenswrapper[4565]: I1125 09:21:56.686819 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.686803882 podStartE2EDuration="2.686803882s" podCreationTimestamp="2025-11-25 09:21:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:21:56.680443733 +0000 UTC m=+1049.882938871" watchObservedRunningTime="2025-11-25 09:21:56.686803882 +0000 UTC m=+1049.889299019" Nov 25 09:21:57 crc kubenswrapper[4565]: I1125 09:21:57.978847 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 25 09:22:00 crc kubenswrapper[4565]: I1125 09:22:00.034623 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 25 09:22:00 crc kubenswrapper[4565]: I1125 09:22:00.035047 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 25 09:22:01 crc kubenswrapper[4565]: I1125 09:22:01.956615 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 09:22:01 crc kubenswrapper[4565]: I1125 09:22:01.957943 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 09:22:02 crc kubenswrapper[4565]: I1125 09:22:02.973075 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="13ff22ce-a715-4dba-aaa3-b6ba3a929d55" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.187:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 25 09:22:02 crc kubenswrapper[4565]: I1125 09:22:02.973150 4565 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-api-0" podUID="13ff22ce-a715-4dba-aaa3-b6ba3a929d55" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.187:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 25 09:22:02 crc kubenswrapper[4565]: I1125 09:22:02.979839 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 25 09:22:03 crc kubenswrapper[4565]: I1125 09:22:03.005637 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 25 09:22:03 crc kubenswrapper[4565]: I1125 09:22:03.797651 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 25 09:22:05 crc kubenswrapper[4565]: I1125 09:22:05.035077 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 25 09:22:05 crc kubenswrapper[4565]: I1125 09:22:05.035470 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 25 09:22:06 crc kubenswrapper[4565]: I1125 09:22:06.052100 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1e090c67-47d1-445e-843f-4cf950699016" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 25 09:22:06 crc kubenswrapper[4565]: I1125 09:22:06.052518 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1e090c67-47d1-445e-843f-4cf950699016" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 25 09:22:08 crc kubenswrapper[4565]: I1125 09:22:08.811276 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/ceilometer-0" Nov 25 09:22:11 crc kubenswrapper[4565]: I1125 09:22:11.966313 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 25 09:22:11 crc kubenswrapper[4565]: I1125 09:22:11.966746 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 25 09:22:11 crc kubenswrapper[4565]: I1125 09:22:11.968348 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 25 09:22:11 crc kubenswrapper[4565]: I1125 09:22:11.968375 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 25 09:22:11 crc kubenswrapper[4565]: I1125 09:22:11.973820 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 25 09:22:11 crc kubenswrapper[4565]: I1125 09:22:11.974114 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 25 09:22:15 crc kubenswrapper[4565]: I1125 09:22:15.041225 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 25 09:22:15 crc kubenswrapper[4565]: I1125 09:22:15.042134 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 25 09:22:15 crc kubenswrapper[4565]: I1125 09:22:15.047032 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 25 09:22:15 crc kubenswrapper[4565]: I1125 09:22:15.047424 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 25 09:22:21 crc kubenswrapper[4565]: I1125 09:22:21.107028 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 09:22:21 crc kubenswrapper[4565]: I1125 09:22:21.952039 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 09:22:25 crc kubenswrapper[4565]: I1125 09:22:25.099440 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:22:25 crc kubenswrapper[4565]: I1125 09:22:25.099857 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:22:25 crc kubenswrapper[4565]: I1125 09:22:25.368070 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="46428d34-ed8b-4584-954a-0c51d96b1c9c" containerName="rabbitmq" containerID="cri-o://c0702f93885d28529a2af7990a35ab326532d86528c519f38b7a30e9aa51de04" gracePeriod=604796 Nov 25 09:22:25 crc kubenswrapper[4565]: I1125 09:22:25.370360 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="46428d34-ed8b-4584-954a-0c51d96b1c9c" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Nov 25 09:22:25 crc kubenswrapper[4565]: I1125 09:22:25.778491 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="b0cc10ca-7483-447d-a1ed-1566c994efdc" containerName="rabbitmq" containerID="cri-o://45b00b7971a56b1f1127cbd3fba4aae085b77e7b2f1cee486f8a50ffb35f30f7" gracePeriod=604797 Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.798029 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.815878 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/46428d34-ed8b-4584-954a-0c51d96b1c9c-rabbitmq-plugins\") pod \"46428d34-ed8b-4584-954a-0c51d96b1c9c\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.815916 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsp4z\" (UniqueName: \"kubernetes.io/projected/46428d34-ed8b-4584-954a-0c51d96b1c9c-kube-api-access-wsp4z\") pod \"46428d34-ed8b-4584-954a-0c51d96b1c9c\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.816009 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/46428d34-ed8b-4584-954a-0c51d96b1c9c-rabbitmq-tls\") pod \"46428d34-ed8b-4584-954a-0c51d96b1c9c\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.816096 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/46428d34-ed8b-4584-954a-0c51d96b1c9c-erlang-cookie-secret\") pod \"46428d34-ed8b-4584-954a-0c51d96b1c9c\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.816201 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"46428d34-ed8b-4584-954a-0c51d96b1c9c\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.816331 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/46428d34-ed8b-4584-954a-0c51d96b1c9c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "46428d34-ed8b-4584-954a-0c51d96b1c9c" (UID: "46428d34-ed8b-4584-954a-0c51d96b1c9c"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.816810 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/46428d34-ed8b-4584-954a-0c51d96b1c9c-server-conf\") pod \"46428d34-ed8b-4584-954a-0c51d96b1c9c\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.816867 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/46428d34-ed8b-4584-954a-0c51d96b1c9c-pod-info\") pod \"46428d34-ed8b-4584-954a-0c51d96b1c9c\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.817015 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/46428d34-ed8b-4584-954a-0c51d96b1c9c-rabbitmq-erlang-cookie\") pod \"46428d34-ed8b-4584-954a-0c51d96b1c9c\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.817107 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/46428d34-ed8b-4584-954a-0c51d96b1c9c-plugins-conf\") pod \"46428d34-ed8b-4584-954a-0c51d96b1c9c\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.817142 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46428d34-ed8b-4584-954a-0c51d96b1c9c-config-data\") pod 
\"46428d34-ed8b-4584-954a-0c51d96b1c9c\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.817169 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/46428d34-ed8b-4584-954a-0c51d96b1c9c-rabbitmq-confd\") pod \"46428d34-ed8b-4584-954a-0c51d96b1c9c\" (UID: \"46428d34-ed8b-4584-954a-0c51d96b1c9c\") " Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.817646 4565 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/46428d34-ed8b-4584-954a-0c51d96b1c9c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.818052 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46428d34-ed8b-4584-954a-0c51d96b1c9c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "46428d34-ed8b-4584-954a-0c51d96b1c9c" (UID: "46428d34-ed8b-4584-954a-0c51d96b1c9c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.818433 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46428d34-ed8b-4584-954a-0c51d96b1c9c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "46428d34-ed8b-4584-954a-0c51d96b1c9c" (UID: "46428d34-ed8b-4584-954a-0c51d96b1c9c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.824829 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46428d34-ed8b-4584-954a-0c51d96b1c9c-kube-api-access-wsp4z" (OuterVolumeSpecName: "kube-api-access-wsp4z") pod "46428d34-ed8b-4584-954a-0c51d96b1c9c" (UID: "46428d34-ed8b-4584-954a-0c51d96b1c9c"). 
InnerVolumeSpecName "kube-api-access-wsp4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.838307 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46428d34-ed8b-4584-954a-0c51d96b1c9c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "46428d34-ed8b-4584-954a-0c51d96b1c9c" (UID: "46428d34-ed8b-4584-954a-0c51d96b1c9c"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.842712 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/46428d34-ed8b-4584-954a-0c51d96b1c9c-pod-info" (OuterVolumeSpecName: "pod-info") pod "46428d34-ed8b-4584-954a-0c51d96b1c9c" (UID: "46428d34-ed8b-4584-954a-0c51d96b1c9c"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.854012 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46428d34-ed8b-4584-954a-0c51d96b1c9c-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "46428d34-ed8b-4584-954a-0c51d96b1c9c" (UID: "46428d34-ed8b-4584-954a-0c51d96b1c9c"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.856133 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "46428d34-ed8b-4584-954a-0c51d96b1c9c" (UID: "46428d34-ed8b-4584-954a-0c51d96b1c9c"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.868500 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46428d34-ed8b-4584-954a-0c51d96b1c9c-config-data" (OuterVolumeSpecName: "config-data") pod "46428d34-ed8b-4584-954a-0c51d96b1c9c" (UID: "46428d34-ed8b-4584-954a-0c51d96b1c9c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.921074 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46428d34-ed8b-4584-954a-0c51d96b1c9c-server-conf" (OuterVolumeSpecName: "server-conf") pod "46428d34-ed8b-4584-954a-0c51d96b1c9c" (UID: "46428d34-ed8b-4584-954a-0c51d96b1c9c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.924619 4565 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/46428d34-ed8b-4584-954a-0c51d96b1c9c-pod-info\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.924651 4565 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/46428d34-ed8b-4584-954a-0c51d96b1c9c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.924663 4565 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/46428d34-ed8b-4584-954a-0c51d96b1c9c-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.924673 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46428d34-ed8b-4584-954a-0c51d96b1c9c-config-data\") on node \"crc\" DevicePath \"\"" Nov 
25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.924690 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsp4z\" (UniqueName: \"kubernetes.io/projected/46428d34-ed8b-4584-954a-0c51d96b1c9c-kube-api-access-wsp4z\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.924698 4565 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/46428d34-ed8b-4584-954a-0c51d96b1c9c-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.924706 4565 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/46428d34-ed8b-4584-954a-0c51d96b1c9c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.924724 4565 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.924732 4565 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/46428d34-ed8b-4584-954a-0c51d96b1c9c-server-conf\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:31 crc kubenswrapper[4565]: I1125 09:22:31.951601 4565 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.009752 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46428d34-ed8b-4584-954a-0c51d96b1c9c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "46428d34-ed8b-4584-954a-0c51d96b1c9c" (UID: "46428d34-ed8b-4584-954a-0c51d96b1c9c"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.025895 4565 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/46428d34-ed8b-4584-954a-0c51d96b1c9c-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.025948 4565 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.036907 4565 generic.go:334] "Generic (PLEG): container finished" podID="b0cc10ca-7483-447d-a1ed-1566c994efdc" containerID="45b00b7971a56b1f1127cbd3fba4aae085b77e7b2f1cee486f8a50ffb35f30f7" exitCode=0 Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.036982 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b0cc10ca-7483-447d-a1ed-1566c994efdc","Type":"ContainerDied","Data":"45b00b7971a56b1f1127cbd3fba4aae085b77e7b2f1cee486f8a50ffb35f30f7"} Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.050235 4565 generic.go:334] "Generic (PLEG): container finished" podID="46428d34-ed8b-4584-954a-0c51d96b1c9c" containerID="c0702f93885d28529a2af7990a35ab326532d86528c519f38b7a30e9aa51de04" exitCode=0 Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.050354 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"46428d34-ed8b-4584-954a-0c51d96b1c9c","Type":"ContainerDied","Data":"c0702f93885d28529a2af7990a35ab326532d86528c519f38b7a30e9aa51de04"} Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.050445 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"46428d34-ed8b-4584-954a-0c51d96b1c9c","Type":"ContainerDied","Data":"58abd32e4b679253016aba0bf57cb8e33dfff347156cf169ff79b1d690c6c4a3"} Nov 25 
09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.050514 4565 scope.go:117] "RemoveContainer" containerID="c0702f93885d28529a2af7990a35ab326532d86528c519f38b7a30e9aa51de04" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.050719 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.095150 4565 scope.go:117] "RemoveContainer" containerID="e4896b199da22c02f746a769e2a48931780200f33e23e7479d74075df91ee76c" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.103081 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.124352 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.135649 4565 scope.go:117] "RemoveContainer" containerID="c0702f93885d28529a2af7990a35ab326532d86528c519f38b7a30e9aa51de04" Nov 25 09:22:32 crc kubenswrapper[4565]: E1125 09:22:32.136252 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0702f93885d28529a2af7990a35ab326532d86528c519f38b7a30e9aa51de04\": container with ID starting with c0702f93885d28529a2af7990a35ab326532d86528c519f38b7a30e9aa51de04 not found: ID does not exist" containerID="c0702f93885d28529a2af7990a35ab326532d86528c519f38b7a30e9aa51de04" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.136285 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0702f93885d28529a2af7990a35ab326532d86528c519f38b7a30e9aa51de04"} err="failed to get container status \"c0702f93885d28529a2af7990a35ab326532d86528c519f38b7a30e9aa51de04\": rpc error: code = NotFound desc = could not find container \"c0702f93885d28529a2af7990a35ab326532d86528c519f38b7a30e9aa51de04\": container with ID starting with 
c0702f93885d28529a2af7990a35ab326532d86528c519f38b7a30e9aa51de04 not found: ID does not exist" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.136306 4565 scope.go:117] "RemoveContainer" containerID="e4896b199da22c02f746a769e2a48931780200f33e23e7479d74075df91ee76c" Nov 25 09:22:32 crc kubenswrapper[4565]: E1125 09:22:32.137130 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4896b199da22c02f746a769e2a48931780200f33e23e7479d74075df91ee76c\": container with ID starting with e4896b199da22c02f746a769e2a48931780200f33e23e7479d74075df91ee76c not found: ID does not exist" containerID="e4896b199da22c02f746a769e2a48931780200f33e23e7479d74075df91ee76c" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.137159 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4896b199da22c02f746a769e2a48931780200f33e23e7479d74075df91ee76c"} err="failed to get container status \"e4896b199da22c02f746a769e2a48931780200f33e23e7479d74075df91ee76c\": rpc error: code = NotFound desc = could not find container \"e4896b199da22c02f746a769e2a48931780200f33e23e7479d74075df91ee76c\": container with ID starting with e4896b199da22c02f746a769e2a48931780200f33e23e7479d74075df91ee76c not found: ID does not exist" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.138601 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 09:22:32 crc kubenswrapper[4565]: E1125 09:22:32.139180 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46428d34-ed8b-4584-954a-0c51d96b1c9c" containerName="rabbitmq" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.139197 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="46428d34-ed8b-4584-954a-0c51d96b1c9c" containerName="rabbitmq" Nov 25 09:22:32 crc kubenswrapper[4565]: E1125 09:22:32.139225 4565 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="46428d34-ed8b-4584-954a-0c51d96b1c9c" containerName="setup-container" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.139233 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="46428d34-ed8b-4584-954a-0c51d96b1c9c" containerName="setup-container" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.139449 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="46428d34-ed8b-4584-954a-0c51d96b1c9c" containerName="rabbitmq" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.140458 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.143640 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.143940 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.144122 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ss96w" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.144249 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.144363 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.144485 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.146798 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.148071 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 09:22:32 crc 
kubenswrapper[4565]: I1125 09:22:32.291634 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.336786 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/49a91aac-079e-475b-ac75-f400d2081405-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.336842 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/49a91aac-079e-475b-ac75-f400d2081405-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.336889 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/49a91aac-079e-475b-ac75-f400d2081405-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.337036 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjcfr\" (UniqueName: \"kubernetes.io/projected/49a91aac-079e-475b-ac75-f400d2081405-kube-api-access-tjcfr\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.337115 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/49a91aac-079e-475b-ac75-f400d2081405-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.337145 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/49a91aac-079e-475b-ac75-f400d2081405-pod-info\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.337164 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/49a91aac-079e-475b-ac75-f400d2081405-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.337222 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49a91aac-079e-475b-ac75-f400d2081405-config-data\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.337240 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/49a91aac-079e-475b-ac75-f400d2081405-server-conf\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.337322 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/49a91aac-079e-475b-ac75-f400d2081405-erlang-cookie-secret\") 
pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.337368 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.439894 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpmk8\" (UniqueName: \"kubernetes.io/projected/b0cc10ca-7483-447d-a1ed-1566c994efdc-kube-api-access-lpmk8\") pod \"b0cc10ca-7483-447d-a1ed-1566c994efdc\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.440788 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"b0cc10ca-7483-447d-a1ed-1566c994efdc\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.440920 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b0cc10ca-7483-447d-a1ed-1566c994efdc-rabbitmq-plugins\") pod \"b0cc10ca-7483-447d-a1ed-1566c994efdc\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.440983 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b0cc10ca-7483-447d-a1ed-1566c994efdc-plugins-conf\") pod \"b0cc10ca-7483-447d-a1ed-1566c994efdc\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.441005 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b0cc10ca-7483-447d-a1ed-1566c994efdc-rabbitmq-tls\") pod \"b0cc10ca-7483-447d-a1ed-1566c994efdc\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.441027 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b0cc10ca-7483-447d-a1ed-1566c994efdc-rabbitmq-erlang-cookie\") pod \"b0cc10ca-7483-447d-a1ed-1566c994efdc\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.441058 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b0cc10ca-7483-447d-a1ed-1566c994efdc-config-data\") pod \"b0cc10ca-7483-447d-a1ed-1566c994efdc\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.441080 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b0cc10ca-7483-447d-a1ed-1566c994efdc-pod-info\") pod \"b0cc10ca-7483-447d-a1ed-1566c994efdc\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.441107 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b0cc10ca-7483-447d-a1ed-1566c994efdc-erlang-cookie-secret\") pod \"b0cc10ca-7483-447d-a1ed-1566c994efdc\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.441147 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b0cc10ca-7483-447d-a1ed-1566c994efdc-server-conf\") pod \"b0cc10ca-7483-447d-a1ed-1566c994efdc\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " 
Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.441185 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b0cc10ca-7483-447d-a1ed-1566c994efdc-rabbitmq-confd\") pod \"b0cc10ca-7483-447d-a1ed-1566c994efdc\" (UID: \"b0cc10ca-7483-447d-a1ed-1566c994efdc\") " Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.441379 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/49a91aac-079e-475b-ac75-f400d2081405-pod-info\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.441405 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/49a91aac-079e-475b-ac75-f400d2081405-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.441461 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49a91aac-079e-475b-ac75-f400d2081405-config-data\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.441483 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/49a91aac-079e-475b-ac75-f400d2081405-server-conf\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.441556 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/49a91aac-079e-475b-ac75-f400d2081405-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.441588 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.441669 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/49a91aac-079e-475b-ac75-f400d2081405-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.441693 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/49a91aac-079e-475b-ac75-f400d2081405-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.441721 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/49a91aac-079e-475b-ac75-f400d2081405-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.441774 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjcfr\" (UniqueName: \"kubernetes.io/projected/49a91aac-079e-475b-ac75-f400d2081405-kube-api-access-tjcfr\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") 
" pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.441830 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/49a91aac-079e-475b-ac75-f400d2081405-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.442776 4565 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.442922 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/49a91aac-079e-475b-ac75-f400d2081405-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.444566 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49a91aac-079e-475b-ac75-f400d2081405-config-data\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.444593 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/49a91aac-079e-475b-ac75-f400d2081405-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.444906 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/49a91aac-079e-475b-ac75-f400d2081405-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.446295 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/49a91aac-079e-475b-ac75-f400d2081405-server-conf\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.449257 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/49a91aac-079e-475b-ac75-f400d2081405-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.450310 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0cc10ca-7483-447d-a1ed-1566c994efdc-kube-api-access-lpmk8" (OuterVolumeSpecName: "kube-api-access-lpmk8") pod "b0cc10ca-7483-447d-a1ed-1566c994efdc" (UID: "b0cc10ca-7483-447d-a1ed-1566c994efdc"). InnerVolumeSpecName "kube-api-access-lpmk8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.451590 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/49a91aac-079e-475b-ac75-f400d2081405-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.452708 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/49a91aac-079e-475b-ac75-f400d2081405-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.453367 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0cc10ca-7483-447d-a1ed-1566c994efdc-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b0cc10ca-7483-447d-a1ed-1566c994efdc" (UID: "b0cc10ca-7483-447d-a1ed-1566c994efdc"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.453976 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0cc10ca-7483-447d-a1ed-1566c994efdc-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b0cc10ca-7483-447d-a1ed-1566c994efdc" (UID: "b0cc10ca-7483-447d-a1ed-1566c994efdc"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.457283 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0cc10ca-7483-447d-a1ed-1566c994efdc-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b0cc10ca-7483-447d-a1ed-1566c994efdc" (UID: "b0cc10ca-7483-447d-a1ed-1566c994efdc"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.458146 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "b0cc10ca-7483-447d-a1ed-1566c994efdc" (UID: "b0cc10ca-7483-447d-a1ed-1566c994efdc"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.462205 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/49a91aac-079e-475b-ac75-f400d2081405-pod-info\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.463335 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0cc10ca-7483-447d-a1ed-1566c994efdc-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b0cc10ca-7483-447d-a1ed-1566c994efdc" (UID: "b0cc10ca-7483-447d-a1ed-1566c994efdc"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.467101 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b0cc10ca-7483-447d-a1ed-1566c994efdc-pod-info" (OuterVolumeSpecName: "pod-info") pod "b0cc10ca-7483-447d-a1ed-1566c994efdc" (UID: "b0cc10ca-7483-447d-a1ed-1566c994efdc"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.474799 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjcfr\" (UniqueName: \"kubernetes.io/projected/49a91aac-079e-475b-ac75-f400d2081405-kube-api-access-tjcfr\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.480169 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0cc10ca-7483-447d-a1ed-1566c994efdc-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b0cc10ca-7483-447d-a1ed-1566c994efdc" (UID: "b0cc10ca-7483-447d-a1ed-1566c994efdc"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.485509 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"49a91aac-079e-475b-ac75-f400d2081405\") " pod="openstack/rabbitmq-server-0" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.496125 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0cc10ca-7483-447d-a1ed-1566c994efdc-config-data" (OuterVolumeSpecName: "config-data") pod "b0cc10ca-7483-447d-a1ed-1566c994efdc" (UID: "b0cc10ca-7483-447d-a1ed-1566c994efdc"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.509944 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0cc10ca-7483-447d-a1ed-1566c994efdc-server-conf" (OuterVolumeSpecName: "server-conf") pod "b0cc10ca-7483-447d-a1ed-1566c994efdc" (UID: "b0cc10ca-7483-447d-a1ed-1566c994efdc"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.543516 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpmk8\" (UniqueName: \"kubernetes.io/projected/b0cc10ca-7483-447d-a1ed-1566c994efdc-kube-api-access-lpmk8\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.543584 4565 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.543600 4565 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b0cc10ca-7483-447d-a1ed-1566c994efdc-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.543610 4565 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b0cc10ca-7483-447d-a1ed-1566c994efdc-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.543619 4565 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b0cc10ca-7483-447d-a1ed-1566c994efdc-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.543628 4565 reconciler_common.go:293] "Volume detached for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b0cc10ca-7483-447d-a1ed-1566c994efdc-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.543636 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b0cc10ca-7483-447d-a1ed-1566c994efdc-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.543645 4565 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b0cc10ca-7483-447d-a1ed-1566c994efdc-pod-info\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.543654 4565 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b0cc10ca-7483-447d-a1ed-1566c994efdc-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.543661 4565 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b0cc10ca-7483-447d-a1ed-1566c994efdc-server-conf\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.564313 4565 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.604073 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0cc10ca-7483-447d-a1ed-1566c994efdc-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b0cc10ca-7483-447d-a1ed-1566c994efdc" (UID: "b0cc10ca-7483-447d-a1ed-1566c994efdc"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.645337 4565 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.645380 4565 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b0cc10ca-7483-447d-a1ed-1566c994efdc-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:32 crc kubenswrapper[4565]: I1125 09:22:32.758483 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.064869 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b0cc10ca-7483-447d-a1ed-1566c994efdc","Type":"ContainerDied","Data":"d7072647dc0086c0bc879c7e99d9f23cbe7986e40404f10cc02246d1d0026ddd"} Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.064964 4565 scope.go:117] "RemoveContainer" containerID="45b00b7971a56b1f1127cbd3fba4aae085b77e7b2f1cee486f8a50ffb35f30f7" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.064984 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.106579 4565 scope.go:117] "RemoveContainer" containerID="d94652e304634ec33bfb162b4c2b317c7742bd20ae9567f935e47470498b93ac" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.114129 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46428d34-ed8b-4584-954a-0c51d96b1c9c" path="/var/lib/kubelet/pods/46428d34-ed8b-4584-954a-0c51d96b1c9c/volumes" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.114983 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.124964 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.136892 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 09:22:33 crc kubenswrapper[4565]: E1125 09:22:33.137598 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cc10ca-7483-447d-a1ed-1566c994efdc" containerName="setup-container" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.137628 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cc10ca-7483-447d-a1ed-1566c994efdc" containerName="setup-container" Nov 25 09:22:33 crc kubenswrapper[4565]: E1125 09:22:33.137662 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cc10ca-7483-447d-a1ed-1566c994efdc" containerName="rabbitmq" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.137669 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cc10ca-7483-447d-a1ed-1566c994efdc" containerName="rabbitmq" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.138097 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cc10ca-7483-447d-a1ed-1566c994efdc" containerName="rabbitmq" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 
09:22:33.139647 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.140486 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.144956 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.144985 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.145099 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.144994 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.145129 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.148446 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.148778 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xg26s" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.258409 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.262918 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.263136 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.263228 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.263274 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.263329 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.263422 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n54f\" (UniqueName: \"kubernetes.io/projected/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-kube-api-access-4n54f\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.263462 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.263549 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.263589 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.263620 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.263657 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.366658 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.366716 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.366751 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.366776 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.366812 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n54f\" (UniqueName: \"kubernetes.io/projected/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-kube-api-access-4n54f\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: 
I1125 09:22:33.366844 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.366871 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.366892 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.366916 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.366957 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.367045 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.367667 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.367913 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.368006 4565 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.368241 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.368914 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.369243 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.377399 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.378404 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.379118 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.379547 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.384058 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n54f\" (UniqueName: \"kubernetes.io/projected/9914fdc4-3539-4d6b-97cf-e4c5330acfc0-kube-api-access-4n54f\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.415422 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9914fdc4-3539-4d6b-97cf-e4c5330acfc0\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.463658 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 25 09:22:33 crc kubenswrapper[4565]: I1125 09:22:33.932323 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.032138 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-568675b579-wb8j4"] Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.034767 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-568675b579-wb8j4" Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.041613 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.047992 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568675b579-wb8j4"] Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.119175 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"49a91aac-079e-475b-ac75-f400d2081405","Type":"ContainerStarted","Data":"a102eabebab7dd7510ef9b11b55879e3ca460c9f85a7a6dda8e92aec842e81d9"} Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.123591 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9914fdc4-3539-4d6b-97cf-e4c5330acfc0","Type":"ContainerStarted","Data":"c6a33a4366246ccb7896bdb5250338f333c15d5a7a2cbc62290f754936fb98a1"} Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.191004 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-dns-svc\") pod \"dnsmasq-dns-568675b579-wb8j4\" (UID: \"07aec95c-ee3d-47fc-9893-5c4f792705ea\") " pod="openstack/dnsmasq-dns-568675b579-wb8j4" Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.191081 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tckc4\" (UniqueName: \"kubernetes.io/projected/07aec95c-ee3d-47fc-9893-5c4f792705ea-kube-api-access-tckc4\") pod \"dnsmasq-dns-568675b579-wb8j4\" (UID: \"07aec95c-ee3d-47fc-9893-5c4f792705ea\") " pod="openstack/dnsmasq-dns-568675b579-wb8j4" Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.191275 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-ovsdbserver-nb\") pod \"dnsmasq-dns-568675b579-wb8j4\" (UID: \"07aec95c-ee3d-47fc-9893-5c4f792705ea\") " pod="openstack/dnsmasq-dns-568675b579-wb8j4" Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.191389 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-openstack-edpm-ipam\") pod \"dnsmasq-dns-568675b579-wb8j4\" (UID: \"07aec95c-ee3d-47fc-9893-5c4f792705ea\") " pod="openstack/dnsmasq-dns-568675b579-wb8j4" Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.191447 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-config\") pod \"dnsmasq-dns-568675b579-wb8j4\" (UID: \"07aec95c-ee3d-47fc-9893-5c4f792705ea\") " pod="openstack/dnsmasq-dns-568675b579-wb8j4" Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.191475 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-ovsdbserver-sb\") pod \"dnsmasq-dns-568675b579-wb8j4\" (UID: \"07aec95c-ee3d-47fc-9893-5c4f792705ea\") " pod="openstack/dnsmasq-dns-568675b579-wb8j4" Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.293745 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-ovsdbserver-nb\") pod \"dnsmasq-dns-568675b579-wb8j4\" (UID: \"07aec95c-ee3d-47fc-9893-5c4f792705ea\") " pod="openstack/dnsmasq-dns-568675b579-wb8j4" Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.293855 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-openstack-edpm-ipam\") pod \"dnsmasq-dns-568675b579-wb8j4\" (UID: \"07aec95c-ee3d-47fc-9893-5c4f792705ea\") " pod="openstack/dnsmasq-dns-568675b579-wb8j4" Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.293889 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-config\") pod \"dnsmasq-dns-568675b579-wb8j4\" (UID: \"07aec95c-ee3d-47fc-9893-5c4f792705ea\") " pod="openstack/dnsmasq-dns-568675b579-wb8j4" Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.293916 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-ovsdbserver-sb\") pod \"dnsmasq-dns-568675b579-wb8j4\" (UID: \"07aec95c-ee3d-47fc-9893-5c4f792705ea\") " pod="openstack/dnsmasq-dns-568675b579-wb8j4" Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.293998 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-dns-svc\") pod \"dnsmasq-dns-568675b579-wb8j4\" (UID: \"07aec95c-ee3d-47fc-9893-5c4f792705ea\") " pod="openstack/dnsmasq-dns-568675b579-wb8j4" Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.294040 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tckc4\" (UniqueName: \"kubernetes.io/projected/07aec95c-ee3d-47fc-9893-5c4f792705ea-kube-api-access-tckc4\") pod \"dnsmasq-dns-568675b579-wb8j4\" (UID: \"07aec95c-ee3d-47fc-9893-5c4f792705ea\") " pod="openstack/dnsmasq-dns-568675b579-wb8j4" Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.294698 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-ovsdbserver-sb\") pod \"dnsmasq-dns-568675b579-wb8j4\" (UID: \"07aec95c-ee3d-47fc-9893-5c4f792705ea\") " pod="openstack/dnsmasq-dns-568675b579-wb8j4" Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.294900 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-openstack-edpm-ipam\") pod \"dnsmasq-dns-568675b579-wb8j4\" (UID: \"07aec95c-ee3d-47fc-9893-5c4f792705ea\") " pod="openstack/dnsmasq-dns-568675b579-wb8j4" Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.294981 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-config\") pod \"dnsmasq-dns-568675b579-wb8j4\" (UID: \"07aec95c-ee3d-47fc-9893-5c4f792705ea\") " pod="openstack/dnsmasq-dns-568675b579-wb8j4" Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.295218 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-ovsdbserver-nb\") pod \"dnsmasq-dns-568675b579-wb8j4\" (UID: \"07aec95c-ee3d-47fc-9893-5c4f792705ea\") " pod="openstack/dnsmasq-dns-568675b579-wb8j4" Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.295286 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-dns-svc\") pod \"dnsmasq-dns-568675b579-wb8j4\" (UID: \"07aec95c-ee3d-47fc-9893-5c4f792705ea\") " pod="openstack/dnsmasq-dns-568675b579-wb8j4" Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.396056 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tckc4\" (UniqueName: \"kubernetes.io/projected/07aec95c-ee3d-47fc-9893-5c4f792705ea-kube-api-access-tckc4\") pod 
\"dnsmasq-dns-568675b579-wb8j4\" (UID: \"07aec95c-ee3d-47fc-9893-5c4f792705ea\") " pod="openstack/dnsmasq-dns-568675b579-wb8j4" Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.420716 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568675b579-wb8j4" Nov 25 09:22:34 crc kubenswrapper[4565]: I1125 09:22:34.849246 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568675b579-wb8j4"] Nov 25 09:22:35 crc kubenswrapper[4565]: I1125 09:22:35.106690 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0cc10ca-7483-447d-a1ed-1566c994efdc" path="/var/lib/kubelet/pods/b0cc10ca-7483-447d-a1ed-1566c994efdc/volumes" Nov 25 09:22:35 crc kubenswrapper[4565]: I1125 09:22:35.138145 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"49a91aac-079e-475b-ac75-f400d2081405","Type":"ContainerStarted","Data":"df3befdb7948c477f286f4f7bf958ba6dd411b76b702e92a01b28ef9af29b40d"} Nov 25 09:22:35 crc kubenswrapper[4565]: I1125 09:22:35.140371 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568675b579-wb8j4" event={"ID":"07aec95c-ee3d-47fc-9893-5c4f792705ea","Type":"ContainerStarted","Data":"9e52425c6a13cc08b0524ca32e6a4a24726167dae14123949b33e50f02a26a45"} Nov 25 09:22:36 crc kubenswrapper[4565]: I1125 09:22:36.152764 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9914fdc4-3539-4d6b-97cf-e4c5330acfc0","Type":"ContainerStarted","Data":"1154336540deec443284793eb5635a1cde5190c2f11ec4769c682e8b99413726"} Nov 25 09:22:36 crc kubenswrapper[4565]: I1125 09:22:36.155191 4565 generic.go:334] "Generic (PLEG): container finished" podID="07aec95c-ee3d-47fc-9893-5c4f792705ea" containerID="ca00f4e1cefbd08d02a68f02d0589695b4e14545738cbf21761c916e353657a8" exitCode=0 Nov 25 09:22:36 crc kubenswrapper[4565]: I1125 09:22:36.155294 4565 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568675b579-wb8j4" event={"ID":"07aec95c-ee3d-47fc-9893-5c4f792705ea","Type":"ContainerDied","Data":"ca00f4e1cefbd08d02a68f02d0589695b4e14545738cbf21761c916e353657a8"} Nov 25 09:22:37 crc kubenswrapper[4565]: I1125 09:22:37.165201 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568675b579-wb8j4" event={"ID":"07aec95c-ee3d-47fc-9893-5c4f792705ea","Type":"ContainerStarted","Data":"1e2ab61d95843731498a0bab6e504136d413f1377322db32b6f0293e9ab2ce84"} Nov 25 09:22:37 crc kubenswrapper[4565]: I1125 09:22:37.189852 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-568675b579-wb8j4" podStartSLOduration=4.189831914 podStartE2EDuration="4.189831914s" podCreationTimestamp="2025-11-25 09:22:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:22:37.182485347 +0000 UTC m=+1090.384980485" watchObservedRunningTime="2025-11-25 09:22:37.189831914 +0000 UTC m=+1090.392327052" Nov 25 09:22:38 crc kubenswrapper[4565]: I1125 09:22:38.175092 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-568675b579-wb8j4" Nov 25 09:22:44 crc kubenswrapper[4565]: I1125 09:22:44.422083 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-568675b579-wb8j4" Nov 25 09:22:44 crc kubenswrapper[4565]: I1125 09:22:44.482685 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c9b558957-9nj2j"] Nov 25 09:22:44 crc kubenswrapper[4565]: I1125 09:22:44.483033 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c9b558957-9nj2j" podUID="6b02475b-df47-4dcd-b04b-4f3294a87d56" containerName="dnsmasq-dns" containerID="cri-o://9c6e0741d178b38e6e4916def35d1d7ba3fd60dc60fe1eead4701a4ff046d57c" gracePeriod=10 Nov 25 
09:22:44 crc kubenswrapper[4565]: I1125 09:22:44.717452 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-858f54d499-ngjgw"] Nov 25 09:22:44 crc kubenswrapper[4565]: I1125 09:22:44.720605 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-858f54d499-ngjgw" Nov 25 09:22:44 crc kubenswrapper[4565]: I1125 09:22:44.747628 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-858f54d499-ngjgw"] Nov 25 09:22:44 crc kubenswrapper[4565]: I1125 09:22:44.814289 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-ovsdbserver-nb\") pod \"dnsmasq-dns-858f54d499-ngjgw\" (UID: \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\") " pod="openstack/dnsmasq-dns-858f54d499-ngjgw" Nov 25 09:22:44 crc kubenswrapper[4565]: I1125 09:22:44.814359 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-dns-svc\") pod \"dnsmasq-dns-858f54d499-ngjgw\" (UID: \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\") " pod="openstack/dnsmasq-dns-858f54d499-ngjgw" Nov 25 09:22:44 crc kubenswrapper[4565]: I1125 09:22:44.814423 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-ovsdbserver-sb\") pod \"dnsmasq-dns-858f54d499-ngjgw\" (UID: \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\") " pod="openstack/dnsmasq-dns-858f54d499-ngjgw" Nov 25 09:22:44 crc kubenswrapper[4565]: I1125 09:22:44.814444 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-openstack-edpm-ipam\") 
pod \"dnsmasq-dns-858f54d499-ngjgw\" (UID: \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\") " pod="openstack/dnsmasq-dns-858f54d499-ngjgw" Nov 25 09:22:44 crc kubenswrapper[4565]: I1125 09:22:44.814682 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfvcc\" (UniqueName: \"kubernetes.io/projected/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-kube-api-access-cfvcc\") pod \"dnsmasq-dns-858f54d499-ngjgw\" (UID: \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\") " pod="openstack/dnsmasq-dns-858f54d499-ngjgw" Nov 25 09:22:44 crc kubenswrapper[4565]: I1125 09:22:44.814843 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-config\") pod \"dnsmasq-dns-858f54d499-ngjgw\" (UID: \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\") " pod="openstack/dnsmasq-dns-858f54d499-ngjgw" Nov 25 09:22:44 crc kubenswrapper[4565]: I1125 09:22:44.916762 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-config\") pod \"dnsmasq-dns-858f54d499-ngjgw\" (UID: \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\") " pod="openstack/dnsmasq-dns-858f54d499-ngjgw" Nov 25 09:22:44 crc kubenswrapper[4565]: I1125 09:22:44.916835 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-ovsdbserver-nb\") pod \"dnsmasq-dns-858f54d499-ngjgw\" (UID: \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\") " pod="openstack/dnsmasq-dns-858f54d499-ngjgw" Nov 25 09:22:44 crc kubenswrapper[4565]: I1125 09:22:44.916880 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-dns-svc\") pod \"dnsmasq-dns-858f54d499-ngjgw\" (UID: 
\"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\") " pod="openstack/dnsmasq-dns-858f54d499-ngjgw" Nov 25 09:22:44 crc kubenswrapper[4565]: I1125 09:22:44.916951 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-ovsdbserver-sb\") pod \"dnsmasq-dns-858f54d499-ngjgw\" (UID: \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\") " pod="openstack/dnsmasq-dns-858f54d499-ngjgw" Nov 25 09:22:44 crc kubenswrapper[4565]: I1125 09:22:44.916977 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-openstack-edpm-ipam\") pod \"dnsmasq-dns-858f54d499-ngjgw\" (UID: \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\") " pod="openstack/dnsmasq-dns-858f54d499-ngjgw" Nov 25 09:22:44 crc kubenswrapper[4565]: I1125 09:22:44.917083 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfvcc\" (UniqueName: \"kubernetes.io/projected/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-kube-api-access-cfvcc\") pod \"dnsmasq-dns-858f54d499-ngjgw\" (UID: \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\") " pod="openstack/dnsmasq-dns-858f54d499-ngjgw" Nov 25 09:22:44 crc kubenswrapper[4565]: I1125 09:22:44.917752 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-ovsdbserver-nb\") pod \"dnsmasq-dns-858f54d499-ngjgw\" (UID: \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\") " pod="openstack/dnsmasq-dns-858f54d499-ngjgw" Nov 25 09:22:44 crc kubenswrapper[4565]: I1125 09:22:44.917786 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-dns-svc\") pod \"dnsmasq-dns-858f54d499-ngjgw\" (UID: \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\") " 
pod="openstack/dnsmasq-dns-858f54d499-ngjgw" Nov 25 09:22:44 crc kubenswrapper[4565]: I1125 09:22:44.918174 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-ovsdbserver-sb\") pod \"dnsmasq-dns-858f54d499-ngjgw\" (UID: \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\") " pod="openstack/dnsmasq-dns-858f54d499-ngjgw" Nov 25 09:22:44 crc kubenswrapper[4565]: I1125 09:22:44.918541 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-openstack-edpm-ipam\") pod \"dnsmasq-dns-858f54d499-ngjgw\" (UID: \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\") " pod="openstack/dnsmasq-dns-858f54d499-ngjgw" Nov 25 09:22:44 crc kubenswrapper[4565]: I1125 09:22:44.919084 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-config\") pod \"dnsmasq-dns-858f54d499-ngjgw\" (UID: \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\") " pod="openstack/dnsmasq-dns-858f54d499-ngjgw" Nov 25 09:22:44 crc kubenswrapper[4565]: I1125 09:22:44.934583 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfvcc\" (UniqueName: \"kubernetes.io/projected/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-kube-api-access-cfvcc\") pod \"dnsmasq-dns-858f54d499-ngjgw\" (UID: \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\") " pod="openstack/dnsmasq-dns-858f54d499-ngjgw" Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.033079 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c9b558957-9nj2j" Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.045969 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-858f54d499-ngjgw" Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.120599 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b02475b-df47-4dcd-b04b-4f3294a87d56-ovsdbserver-sb\") pod \"6b02475b-df47-4dcd-b04b-4f3294a87d56\" (UID: \"6b02475b-df47-4dcd-b04b-4f3294a87d56\") " Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.120661 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b02475b-df47-4dcd-b04b-4f3294a87d56-config\") pod \"6b02475b-df47-4dcd-b04b-4f3294a87d56\" (UID: \"6b02475b-df47-4dcd-b04b-4f3294a87d56\") " Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.120701 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b02475b-df47-4dcd-b04b-4f3294a87d56-dns-svc\") pod \"6b02475b-df47-4dcd-b04b-4f3294a87d56\" (UID: \"6b02475b-df47-4dcd-b04b-4f3294a87d56\") " Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.120770 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx98n\" (UniqueName: \"kubernetes.io/projected/6b02475b-df47-4dcd-b04b-4f3294a87d56-kube-api-access-fx98n\") pod \"6b02475b-df47-4dcd-b04b-4f3294a87d56\" (UID: \"6b02475b-df47-4dcd-b04b-4f3294a87d56\") " Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.120808 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b02475b-df47-4dcd-b04b-4f3294a87d56-ovsdbserver-nb\") pod \"6b02475b-df47-4dcd-b04b-4f3294a87d56\" (UID: \"6b02475b-df47-4dcd-b04b-4f3294a87d56\") " Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.134767 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6b02475b-df47-4dcd-b04b-4f3294a87d56-kube-api-access-fx98n" (OuterVolumeSpecName: "kube-api-access-fx98n") pod "6b02475b-df47-4dcd-b04b-4f3294a87d56" (UID: "6b02475b-df47-4dcd-b04b-4f3294a87d56"). InnerVolumeSpecName "kube-api-access-fx98n". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.185416 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b02475b-df47-4dcd-b04b-4f3294a87d56-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6b02475b-df47-4dcd-b04b-4f3294a87d56" (UID: "6b02475b-df47-4dcd-b04b-4f3294a87d56"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.214341 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b02475b-df47-4dcd-b04b-4f3294a87d56-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6b02475b-df47-4dcd-b04b-4f3294a87d56" (UID: "6b02475b-df47-4dcd-b04b-4f3294a87d56"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.228523 4565 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b02475b-df47-4dcd-b04b-4f3294a87d56-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.228550 4565 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b02475b-df47-4dcd-b04b-4f3294a87d56-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.228563 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx98n\" (UniqueName: \"kubernetes.io/projected/6b02475b-df47-4dcd-b04b-4f3294a87d56-kube-api-access-fx98n\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.251613 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b02475b-df47-4dcd-b04b-4f3294a87d56-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6b02475b-df47-4dcd-b04b-4f3294a87d56" (UID: "6b02475b-df47-4dcd-b04b-4f3294a87d56"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.260107 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b02475b-df47-4dcd-b04b-4f3294a87d56-config" (OuterVolumeSpecName: "config") pod "6b02475b-df47-4dcd-b04b-4f3294a87d56" (UID: "6b02475b-df47-4dcd-b04b-4f3294a87d56"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.261692 4565 generic.go:334] "Generic (PLEG): container finished" podID="6b02475b-df47-4dcd-b04b-4f3294a87d56" containerID="9c6e0741d178b38e6e4916def35d1d7ba3fd60dc60fe1eead4701a4ff046d57c" exitCode=0 Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.261744 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c9b558957-9nj2j" event={"ID":"6b02475b-df47-4dcd-b04b-4f3294a87d56","Type":"ContainerDied","Data":"9c6e0741d178b38e6e4916def35d1d7ba3fd60dc60fe1eead4701a4ff046d57c"} Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.261778 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c9b558957-9nj2j" event={"ID":"6b02475b-df47-4dcd-b04b-4f3294a87d56","Type":"ContainerDied","Data":"d825c291f039b63c053fcbc5485b06c4c5b95d2c34b9ffe9ffe6f74792147496"} Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.261799 4565 scope.go:117] "RemoveContainer" containerID="9c6e0741d178b38e6e4916def35d1d7ba3fd60dc60fe1eead4701a4ff046d57c" Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.262162 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c9b558957-9nj2j" Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.333585 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b02475b-df47-4dcd-b04b-4f3294a87d56-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.333855 4565 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b02475b-df47-4dcd-b04b-4f3294a87d56-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.334849 4565 scope.go:117] "RemoveContainer" containerID="f01ba2eadf32c3355721856c5758dd10f41e42091fdd103d533a407a659f3a78" Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.348982 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c9b558957-9nj2j"] Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.377055 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c9b558957-9nj2j"] Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.395074 4565 scope.go:117] "RemoveContainer" containerID="9c6e0741d178b38e6e4916def35d1d7ba3fd60dc60fe1eead4701a4ff046d57c" Nov 25 09:22:45 crc kubenswrapper[4565]: E1125 09:22:45.400692 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c6e0741d178b38e6e4916def35d1d7ba3fd60dc60fe1eead4701a4ff046d57c\": container with ID starting with 9c6e0741d178b38e6e4916def35d1d7ba3fd60dc60fe1eead4701a4ff046d57c not found: ID does not exist" containerID="9c6e0741d178b38e6e4916def35d1d7ba3fd60dc60fe1eead4701a4ff046d57c" Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.400722 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c6e0741d178b38e6e4916def35d1d7ba3fd60dc60fe1eead4701a4ff046d57c"} err="failed to get container 
status \"9c6e0741d178b38e6e4916def35d1d7ba3fd60dc60fe1eead4701a4ff046d57c\": rpc error: code = NotFound desc = could not find container \"9c6e0741d178b38e6e4916def35d1d7ba3fd60dc60fe1eead4701a4ff046d57c\": container with ID starting with 9c6e0741d178b38e6e4916def35d1d7ba3fd60dc60fe1eead4701a4ff046d57c not found: ID does not exist" Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.400744 4565 scope.go:117] "RemoveContainer" containerID="f01ba2eadf32c3355721856c5758dd10f41e42091fdd103d533a407a659f3a78" Nov 25 09:22:45 crc kubenswrapper[4565]: E1125 09:22:45.404634 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f01ba2eadf32c3355721856c5758dd10f41e42091fdd103d533a407a659f3a78\": container with ID starting with f01ba2eadf32c3355721856c5758dd10f41e42091fdd103d533a407a659f3a78 not found: ID does not exist" containerID="f01ba2eadf32c3355721856c5758dd10f41e42091fdd103d533a407a659f3a78" Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.404661 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01ba2eadf32c3355721856c5758dd10f41e42091fdd103d533a407a659f3a78"} err="failed to get container status \"f01ba2eadf32c3355721856c5758dd10f41e42091fdd103d533a407a659f3a78\": rpc error: code = NotFound desc = could not find container \"f01ba2eadf32c3355721856c5758dd10f41e42091fdd103d533a407a659f3a78\": container with ID starting with f01ba2eadf32c3355721856c5758dd10f41e42091fdd103d533a407a659f3a78 not found: ID does not exist" Nov 25 09:22:45 crc kubenswrapper[4565]: I1125 09:22:45.512267 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-858f54d499-ngjgw"] Nov 25 09:22:46 crc kubenswrapper[4565]: I1125 09:22:46.278130 4565 generic.go:334] "Generic (PLEG): container finished" podID="a957b81e-1acc-4e1c-be9b-0c5be361ebb3" containerID="cd631c5374a4dfbeb7aceaeb7997ae23a826c021fbecee1fb56f2511fd59603c" exitCode=0 Nov 25 
09:22:46 crc kubenswrapper[4565]: I1125 09:22:46.278197 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-858f54d499-ngjgw" event={"ID":"a957b81e-1acc-4e1c-be9b-0c5be361ebb3","Type":"ContainerDied","Data":"cd631c5374a4dfbeb7aceaeb7997ae23a826c021fbecee1fb56f2511fd59603c"} Nov 25 09:22:46 crc kubenswrapper[4565]: I1125 09:22:46.278575 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-858f54d499-ngjgw" event={"ID":"a957b81e-1acc-4e1c-be9b-0c5be361ebb3","Type":"ContainerStarted","Data":"32c3237b462740ab9c03801a11150b3697e1f2d9542189744fad4aabf1ced20a"} Nov 25 09:22:47 crc kubenswrapper[4565]: I1125 09:22:47.107679 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b02475b-df47-4dcd-b04b-4f3294a87d56" path="/var/lib/kubelet/pods/6b02475b-df47-4dcd-b04b-4f3294a87d56/volumes" Nov 25 09:22:47 crc kubenswrapper[4565]: I1125 09:22:47.292580 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-858f54d499-ngjgw" event={"ID":"a957b81e-1acc-4e1c-be9b-0c5be361ebb3","Type":"ContainerStarted","Data":"7237006e54da0b2b5355ea89f0f6624c100e6576d7d63fbac41bdccc12c3ac22"} Nov 25 09:22:47 crc kubenswrapper[4565]: I1125 09:22:47.292751 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-858f54d499-ngjgw" Nov 25 09:22:47 crc kubenswrapper[4565]: I1125 09:22:47.320721 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-858f54d499-ngjgw" podStartSLOduration=3.320699381 podStartE2EDuration="3.320699381s" podCreationTimestamp="2025-11-25 09:22:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:22:47.309461307 +0000 UTC m=+1100.511956445" watchObservedRunningTime="2025-11-25 09:22:47.320699381 +0000 UTC m=+1100.523194519" Nov 25 09:22:49 crc kubenswrapper[4565]: I1125 09:22:49.827570 4565 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-c9b558957-9nj2j" podUID="6b02475b-df47-4dcd-b04b-4f3294a87d56" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.183:5353: i/o timeout" Nov 25 09:22:50 crc kubenswrapper[4565]: I1125 09:22:50.083830 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5"] Nov 25 09:22:50 crc kubenswrapper[4565]: E1125 09:22:50.084596 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b02475b-df47-4dcd-b04b-4f3294a87d56" containerName="init" Nov 25 09:22:50 crc kubenswrapper[4565]: I1125 09:22:50.084636 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b02475b-df47-4dcd-b04b-4f3294a87d56" containerName="init" Nov 25 09:22:50 crc kubenswrapper[4565]: E1125 09:22:50.084683 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b02475b-df47-4dcd-b04b-4f3294a87d56" containerName="dnsmasq-dns" Nov 25 09:22:50 crc kubenswrapper[4565]: I1125 09:22:50.084693 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b02475b-df47-4dcd-b04b-4f3294a87d56" containerName="dnsmasq-dns" Nov 25 09:22:50 crc kubenswrapper[4565]: I1125 09:22:50.085126 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b02475b-df47-4dcd-b04b-4f3294a87d56" containerName="dnsmasq-dns" Nov 25 09:22:50 crc kubenswrapper[4565]: I1125 09:22:50.086340 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5" Nov 25 09:22:50 crc kubenswrapper[4565]: I1125 09:22:50.089235 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 09:22:50 crc kubenswrapper[4565]: I1125 09:22:50.090090 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 09:22:50 crc kubenswrapper[4565]: I1125 09:22:50.090133 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 09:22:50 crc kubenswrapper[4565]: I1125 09:22:50.092458 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5"] Nov 25 09:22:50 crc kubenswrapper[4565]: I1125 09:22:50.097022 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc" Nov 25 09:22:50 crc kubenswrapper[4565]: I1125 09:22:50.141257 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac9f2c5-d89d-44d7-8138-875852e7c565-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5\" (UID: \"0ac9f2c5-d89d-44d7-8138-875852e7c565\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5" Nov 25 09:22:50 crc kubenswrapper[4565]: I1125 09:22:50.141361 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ac9f2c5-d89d-44d7-8138-875852e7c565-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5\" (UID: \"0ac9f2c5-d89d-44d7-8138-875852e7c565\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5" Nov 25 09:22:50 crc kubenswrapper[4565]: I1125 09:22:50.141389 4565 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qt8h\" (UniqueName: \"kubernetes.io/projected/0ac9f2c5-d89d-44d7-8138-875852e7c565-kube-api-access-5qt8h\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5\" (UID: \"0ac9f2c5-d89d-44d7-8138-875852e7c565\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5" Nov 25 09:22:50 crc kubenswrapper[4565]: I1125 09:22:50.141451 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac9f2c5-d89d-44d7-8138-875852e7c565-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5\" (UID: \"0ac9f2c5-d89d-44d7-8138-875852e7c565\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5" Nov 25 09:22:50 crc kubenswrapper[4565]: I1125 09:22:50.243631 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ac9f2c5-d89d-44d7-8138-875852e7c565-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5\" (UID: \"0ac9f2c5-d89d-44d7-8138-875852e7c565\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5" Nov 25 09:22:50 crc kubenswrapper[4565]: I1125 09:22:50.243700 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qt8h\" (UniqueName: \"kubernetes.io/projected/0ac9f2c5-d89d-44d7-8138-875852e7c565-kube-api-access-5qt8h\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5\" (UID: \"0ac9f2c5-d89d-44d7-8138-875852e7c565\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5" Nov 25 09:22:50 crc kubenswrapper[4565]: I1125 09:22:50.243824 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0ac9f2c5-d89d-44d7-8138-875852e7c565-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5\" (UID: \"0ac9f2c5-d89d-44d7-8138-875852e7c565\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5" Nov 25 09:22:50 crc kubenswrapper[4565]: I1125 09:22:50.244008 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac9f2c5-d89d-44d7-8138-875852e7c565-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5\" (UID: \"0ac9f2c5-d89d-44d7-8138-875852e7c565\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5" Nov 25 09:22:50 crc kubenswrapper[4565]: I1125 09:22:50.251684 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac9f2c5-d89d-44d7-8138-875852e7c565-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5\" (UID: \"0ac9f2c5-d89d-44d7-8138-875852e7c565\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5" Nov 25 09:22:50 crc kubenswrapper[4565]: I1125 09:22:50.251820 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ac9f2c5-d89d-44d7-8138-875852e7c565-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5\" (UID: \"0ac9f2c5-d89d-44d7-8138-875852e7c565\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5" Nov 25 09:22:50 crc kubenswrapper[4565]: I1125 09:22:50.252031 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac9f2c5-d89d-44d7-8138-875852e7c565-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5\" (UID: \"0ac9f2c5-d89d-44d7-8138-875852e7c565\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5" Nov 25 09:22:50 
crc kubenswrapper[4565]: I1125 09:22:50.258456 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qt8h\" (UniqueName: \"kubernetes.io/projected/0ac9f2c5-d89d-44d7-8138-875852e7c565-kube-api-access-5qt8h\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5\" (UID: \"0ac9f2c5-d89d-44d7-8138-875852e7c565\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5" Nov 25 09:22:50 crc kubenswrapper[4565]: I1125 09:22:50.419699 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5" Nov 25 09:22:50 crc kubenswrapper[4565]: I1125 09:22:50.921994 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5"] Nov 25 09:22:51 crc kubenswrapper[4565]: I1125 09:22:51.331881 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5" event={"ID":"0ac9f2c5-d89d-44d7-8138-875852e7c565","Type":"ContainerStarted","Data":"74396f4b6f13ae0303d16de1614214028f1f9725c522c7f59bf26d40835426bb"} Nov 25 09:22:55 crc kubenswrapper[4565]: I1125 09:22:55.048125 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-858f54d499-ngjgw" Nov 25 09:22:55 crc kubenswrapper[4565]: I1125 09:22:55.100217 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:22:55 crc kubenswrapper[4565]: I1125 09:22:55.100262 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:22:55 crc kubenswrapper[4565]: I1125 09:22:55.113920 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568675b579-wb8j4"] Nov 25 09:22:55 crc kubenswrapper[4565]: I1125 09:22:55.114160 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-568675b579-wb8j4" podUID="07aec95c-ee3d-47fc-9893-5c4f792705ea" containerName="dnsmasq-dns" containerID="cri-o://1e2ab61d95843731498a0bab6e504136d413f1377322db32b6f0293e9ab2ce84" gracePeriod=10 Nov 25 09:22:55 crc kubenswrapper[4565]: I1125 09:22:55.381857 4565 generic.go:334] "Generic (PLEG): container finished" podID="07aec95c-ee3d-47fc-9893-5c4f792705ea" containerID="1e2ab61d95843731498a0bab6e504136d413f1377322db32b6f0293e9ab2ce84" exitCode=0 Nov 25 09:22:55 crc kubenswrapper[4565]: I1125 09:22:55.381892 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568675b579-wb8j4" event={"ID":"07aec95c-ee3d-47fc-9893-5c4f792705ea","Type":"ContainerDied","Data":"1e2ab61d95843731498a0bab6e504136d413f1377322db32b6f0293e9ab2ce84"} Nov 25 09:22:59 crc kubenswrapper[4565]: I1125 09:22:59.421684 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-568675b579-wb8j4" podUID="07aec95c-ee3d-47fc-9893-5c4f792705ea" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.192:5353: connect: connection refused" Nov 25 09:23:01 crc kubenswrapper[4565]: I1125 09:23:01.572166 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-568675b579-wb8j4" Nov 25 09:23:01 crc kubenswrapper[4565]: I1125 09:23:01.721610 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tckc4\" (UniqueName: \"kubernetes.io/projected/07aec95c-ee3d-47fc-9893-5c4f792705ea-kube-api-access-tckc4\") pod \"07aec95c-ee3d-47fc-9893-5c4f792705ea\" (UID: \"07aec95c-ee3d-47fc-9893-5c4f792705ea\") " Nov 25 09:23:01 crc kubenswrapper[4565]: I1125 09:23:01.721919 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-dns-svc\") pod \"07aec95c-ee3d-47fc-9893-5c4f792705ea\" (UID: \"07aec95c-ee3d-47fc-9893-5c4f792705ea\") " Nov 25 09:23:01 crc kubenswrapper[4565]: I1125 09:23:01.722638 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-ovsdbserver-sb\") pod \"07aec95c-ee3d-47fc-9893-5c4f792705ea\" (UID: \"07aec95c-ee3d-47fc-9893-5c4f792705ea\") " Nov 25 09:23:01 crc kubenswrapper[4565]: I1125 09:23:01.722715 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-ovsdbserver-nb\") pod \"07aec95c-ee3d-47fc-9893-5c4f792705ea\" (UID: \"07aec95c-ee3d-47fc-9893-5c4f792705ea\") " Nov 25 09:23:01 crc kubenswrapper[4565]: I1125 09:23:01.722757 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-openstack-edpm-ipam\") pod \"07aec95c-ee3d-47fc-9893-5c4f792705ea\" (UID: \"07aec95c-ee3d-47fc-9893-5c4f792705ea\") " Nov 25 09:23:01 crc kubenswrapper[4565]: I1125 09:23:01.722805 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-config\") pod \"07aec95c-ee3d-47fc-9893-5c4f792705ea\" (UID: \"07aec95c-ee3d-47fc-9893-5c4f792705ea\") " Nov 25 09:23:01 crc kubenswrapper[4565]: I1125 09:23:01.731053 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07aec95c-ee3d-47fc-9893-5c4f792705ea-kube-api-access-tckc4" (OuterVolumeSpecName: "kube-api-access-tckc4") pod "07aec95c-ee3d-47fc-9893-5c4f792705ea" (UID: "07aec95c-ee3d-47fc-9893-5c4f792705ea"). InnerVolumeSpecName "kube-api-access-tckc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:23:01 crc kubenswrapper[4565]: I1125 09:23:01.767281 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "07aec95c-ee3d-47fc-9893-5c4f792705ea" (UID: "07aec95c-ee3d-47fc-9893-5c4f792705ea"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:23:01 crc kubenswrapper[4565]: I1125 09:23:01.767794 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "07aec95c-ee3d-47fc-9893-5c4f792705ea" (UID: "07aec95c-ee3d-47fc-9893-5c4f792705ea"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:23:01 crc kubenswrapper[4565]: I1125 09:23:01.768185 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-config" (OuterVolumeSpecName: "config") pod "07aec95c-ee3d-47fc-9893-5c4f792705ea" (UID: "07aec95c-ee3d-47fc-9893-5c4f792705ea"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:23:01 crc kubenswrapper[4565]: I1125 09:23:01.768797 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "07aec95c-ee3d-47fc-9893-5c4f792705ea" (UID: "07aec95c-ee3d-47fc-9893-5c4f792705ea"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:23:01 crc kubenswrapper[4565]: I1125 09:23:01.774586 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "07aec95c-ee3d-47fc-9893-5c4f792705ea" (UID: "07aec95c-ee3d-47fc-9893-5c4f792705ea"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:23:01 crc kubenswrapper[4565]: I1125 09:23:01.825394 4565 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 09:23:01 crc kubenswrapper[4565]: I1125 09:23:01.825422 4565 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 09:23:01 crc kubenswrapper[4565]: I1125 09:23:01.825434 4565 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 09:23:01 crc kubenswrapper[4565]: I1125 09:23:01.825445 4565 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 25 
09:23:01 crc kubenswrapper[4565]: I1125 09:23:01.825455 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07aec95c-ee3d-47fc-9893-5c4f792705ea-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:23:01 crc kubenswrapper[4565]: I1125 09:23:01.825465 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tckc4\" (UniqueName: \"kubernetes.io/projected/07aec95c-ee3d-47fc-9893-5c4f792705ea-kube-api-access-tckc4\") on node \"crc\" DevicePath \"\"" Nov 25 09:23:02 crc kubenswrapper[4565]: I1125 09:23:02.447842 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568675b579-wb8j4" event={"ID":"07aec95c-ee3d-47fc-9893-5c4f792705ea","Type":"ContainerDied","Data":"9e52425c6a13cc08b0524ca32e6a4a24726167dae14123949b33e50f02a26a45"} Nov 25 09:23:02 crc kubenswrapper[4565]: I1125 09:23:02.448156 4565 scope.go:117] "RemoveContainer" containerID="1e2ab61d95843731498a0bab6e504136d413f1377322db32b6f0293e9ab2ce84" Nov 25 09:23:02 crc kubenswrapper[4565]: I1125 09:23:02.447876 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-568675b579-wb8j4"
Nov 25 09:23:02 crc kubenswrapper[4565]: I1125 09:23:02.452382 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5" event={"ID":"0ac9f2c5-d89d-44d7-8138-875852e7c565","Type":"ContainerStarted","Data":"216935a3513cec14380d02f4c7b97d9512d00f0ba5456773a1df1cf2e8b34651"}
Nov 25 09:23:02 crc kubenswrapper[4565]: I1125 09:23:02.477129 4565 scope.go:117] "RemoveContainer" containerID="ca00f4e1cefbd08d02a68f02d0589695b4e14545738cbf21761c916e353657a8"
Nov 25 09:23:02 crc kubenswrapper[4565]: I1125 09:23:02.478731 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5" podStartSLOduration=2.096830581 podStartE2EDuration="12.47871803s" podCreationTimestamp="2025-11-25 09:22:50 +0000 UTC" firstStartedPulling="2025-11-25 09:22:50.9357417 +0000 UTC m=+1104.138236838" lastFinishedPulling="2025-11-25 09:23:01.317629148 +0000 UTC m=+1114.520124287" observedRunningTime="2025-11-25 09:23:02.476080227 +0000 UTC m=+1115.678575366" watchObservedRunningTime="2025-11-25 09:23:02.47871803 +0000 UTC m=+1115.681213168"
Nov 25 09:23:02 crc kubenswrapper[4565]: I1125 09:23:02.511609 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568675b579-wb8j4"]
Nov 25 09:23:02 crc kubenswrapper[4565]: I1125 09:23:02.518712 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-568675b579-wb8j4"]
Nov 25 09:23:03 crc kubenswrapper[4565]: I1125 09:23:03.111011 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07aec95c-ee3d-47fc-9893-5c4f792705ea" path="/var/lib/kubelet/pods/07aec95c-ee3d-47fc-9893-5c4f792705ea/volumes"
Nov 25 09:23:06 crc kubenswrapper[4565]: I1125 09:23:06.487597 4565 generic.go:334] "Generic (PLEG): container finished" podID="49a91aac-079e-475b-ac75-f400d2081405" containerID="df3befdb7948c477f286f4f7bf958ba6dd411b76b702e92a01b28ef9af29b40d" exitCode=0
Nov 25 09:23:06 crc kubenswrapper[4565]: I1125 09:23:06.487635 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"49a91aac-079e-475b-ac75-f400d2081405","Type":"ContainerDied","Data":"df3befdb7948c477f286f4f7bf958ba6dd411b76b702e92a01b28ef9af29b40d"}
Nov 25 09:23:07 crc kubenswrapper[4565]: I1125 09:23:07.513068 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"49a91aac-079e-475b-ac75-f400d2081405","Type":"ContainerStarted","Data":"23c7ef4a7b0d5be9a3bc0e9d03218970081be2fd3c7a686d71fd98bfe1a7b8a7"}
Nov 25 09:23:07 crc kubenswrapper[4565]: I1125 09:23:07.516397 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Nov 25 09:23:07 crc kubenswrapper[4565]: I1125 09:23:07.518735 4565 generic.go:334] "Generic (PLEG): container finished" podID="9914fdc4-3539-4d6b-97cf-e4c5330acfc0" containerID="1154336540deec443284793eb5635a1cde5190c2f11ec4769c682e8b99413726" exitCode=0
Nov 25 09:23:07 crc kubenswrapper[4565]: I1125 09:23:07.518957 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9914fdc4-3539-4d6b-97cf-e4c5330acfc0","Type":"ContainerDied","Data":"1154336540deec443284793eb5635a1cde5190c2f11ec4769c682e8b99413726"}
Nov 25 09:23:07 crc kubenswrapper[4565]: I1125 09:23:07.550644 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=35.550626695 podStartE2EDuration="35.550626695s" podCreationTimestamp="2025-11-25 09:22:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:23:07.546451013 +0000 UTC m=+1120.748946141" watchObservedRunningTime="2025-11-25 09:23:07.550626695 +0000 UTC m=+1120.753121833"
Nov 25 09:23:08 crc kubenswrapper[4565]: I1125 09:23:08.529239 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9914fdc4-3539-4d6b-97cf-e4c5330acfc0","Type":"ContainerStarted","Data":"19c1137ff3ca32829f31efe2b659f7cf5906b70ecfc36f44973f9cfd49609c59"}
Nov 25 09:23:08 crc kubenswrapper[4565]: I1125 09:23:08.530559 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Nov 25 09:23:08 crc kubenswrapper[4565]: I1125 09:23:08.550548 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.550530445 podStartE2EDuration="35.550530445s" podCreationTimestamp="2025-11-25 09:22:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:23:08.546722937 +0000 UTC m=+1121.749218076" watchObservedRunningTime="2025-11-25 09:23:08.550530445 +0000 UTC m=+1121.753025583"
Nov 25 09:23:13 crc kubenswrapper[4565]: I1125 09:23:13.582892 4565 generic.go:334] "Generic (PLEG): container finished" podID="0ac9f2c5-d89d-44d7-8138-875852e7c565" containerID="216935a3513cec14380d02f4c7b97d9512d00f0ba5456773a1df1cf2e8b34651" exitCode=0
Nov 25 09:23:13 crc kubenswrapper[4565]: I1125 09:23:13.583013 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5" event={"ID":"0ac9f2c5-d89d-44d7-8138-875852e7c565","Type":"ContainerDied","Data":"216935a3513cec14380d02f4c7b97d9512d00f0ba5456773a1df1cf2e8b34651"}
Nov 25 09:23:14 crc kubenswrapper[4565]: I1125 09:23:14.969003 4565 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.113270 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qt8h\" (UniqueName: \"kubernetes.io/projected/0ac9f2c5-d89d-44d7-8138-875852e7c565-kube-api-access-5qt8h\") pod \"0ac9f2c5-d89d-44d7-8138-875852e7c565\" (UID: \"0ac9f2c5-d89d-44d7-8138-875852e7c565\") "
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.113653 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac9f2c5-d89d-44d7-8138-875852e7c565-inventory\") pod \"0ac9f2c5-d89d-44d7-8138-875852e7c565\" (UID: \"0ac9f2c5-d89d-44d7-8138-875852e7c565\") "
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.113822 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac9f2c5-d89d-44d7-8138-875852e7c565-repo-setup-combined-ca-bundle\") pod \"0ac9f2c5-d89d-44d7-8138-875852e7c565\" (UID: \"0ac9f2c5-d89d-44d7-8138-875852e7c565\") "
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.113949 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ac9f2c5-d89d-44d7-8138-875852e7c565-ssh-key\") pod \"0ac9f2c5-d89d-44d7-8138-875852e7c565\" (UID: \"0ac9f2c5-d89d-44d7-8138-875852e7c565\") "
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.120206 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ac9f2c5-d89d-44d7-8138-875852e7c565-kube-api-access-5qt8h" (OuterVolumeSpecName: "kube-api-access-5qt8h") pod "0ac9f2c5-d89d-44d7-8138-875852e7c565" (UID: "0ac9f2c5-d89d-44d7-8138-875852e7c565"). InnerVolumeSpecName "kube-api-access-5qt8h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.121125 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac9f2c5-d89d-44d7-8138-875852e7c565-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "0ac9f2c5-d89d-44d7-8138-875852e7c565" (UID: "0ac9f2c5-d89d-44d7-8138-875852e7c565"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.173600 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac9f2c5-d89d-44d7-8138-875852e7c565-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0ac9f2c5-d89d-44d7-8138-875852e7c565" (UID: "0ac9f2c5-d89d-44d7-8138-875852e7c565"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.174940 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac9f2c5-d89d-44d7-8138-875852e7c565-inventory" (OuterVolumeSpecName: "inventory") pod "0ac9f2c5-d89d-44d7-8138-875852e7c565" (UID: "0ac9f2c5-d89d-44d7-8138-875852e7c565"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.217772 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qt8h\" (UniqueName: \"kubernetes.io/projected/0ac9f2c5-d89d-44d7-8138-875852e7c565-kube-api-access-5qt8h\") on node \"crc\" DevicePath \"\""
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.217812 4565 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac9f2c5-d89d-44d7-8138-875852e7c565-inventory\") on node \"crc\" DevicePath \"\""
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.217827 4565 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac9f2c5-d89d-44d7-8138-875852e7c565-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.217840 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ac9f2c5-d89d-44d7-8138-875852e7c565-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.608515 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5" event={"ID":"0ac9f2c5-d89d-44d7-8138-875852e7c565","Type":"ContainerDied","Data":"74396f4b6f13ae0303d16de1614214028f1f9725c522c7f59bf26d40835426bb"}
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.608822 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74396f4b6f13ae0303d16de1614214028f1f9725c522c7f59bf26d40835426bb"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.608921 4565 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.704188 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw"]
Nov 25 09:23:15 crc kubenswrapper[4565]: E1125 09:23:15.704886 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07aec95c-ee3d-47fc-9893-5c4f792705ea" containerName="dnsmasq-dns"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.704912 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="07aec95c-ee3d-47fc-9893-5c4f792705ea" containerName="dnsmasq-dns"
Nov 25 09:23:15 crc kubenswrapper[4565]: E1125 09:23:15.704970 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07aec95c-ee3d-47fc-9893-5c4f792705ea" containerName="init"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.704978 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="07aec95c-ee3d-47fc-9893-5c4f792705ea" containerName="init"
Nov 25 09:23:15 crc kubenswrapper[4565]: E1125 09:23:15.705011 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac9f2c5-d89d-44d7-8138-875852e7c565" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.705020 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac9f2c5-d89d-44d7-8138-875852e7c565" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.705311 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="07aec95c-ee3d-47fc-9893-5c4f792705ea" containerName="dnsmasq-dns"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.705340 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ac9f2c5-d89d-44d7-8138-875852e7c565" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.706531 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.711948 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.712136 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.712287 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.712528 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.736997 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw"]
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.745359 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mndkm\" (UniqueName: \"kubernetes.io/projected/2574f63c-b7b2-4edf-a3f3-808a81438878-kube-api-access-mndkm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw\" (UID: \"2574f63c-b7b2-4edf-a3f3-808a81438878\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.745417 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2574f63c-b7b2-4edf-a3f3-808a81438878-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw\" (UID: \"2574f63c-b7b2-4edf-a3f3-808a81438878\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.745645 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2574f63c-b7b2-4edf-a3f3-808a81438878-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw\" (UID: \"2574f63c-b7b2-4edf-a3f3-808a81438878\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.745920 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2574f63c-b7b2-4edf-a3f3-808a81438878-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw\" (UID: \"2574f63c-b7b2-4edf-a3f3-808a81438878\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.848108 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mndkm\" (UniqueName: \"kubernetes.io/projected/2574f63c-b7b2-4edf-a3f3-808a81438878-kube-api-access-mndkm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw\" (UID: \"2574f63c-b7b2-4edf-a3f3-808a81438878\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.848475 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2574f63c-b7b2-4edf-a3f3-808a81438878-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw\" (UID: \"2574f63c-b7b2-4edf-a3f3-808a81438878\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.849707 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/2574f63c-b7b2-4edf-a3f3-808a81438878-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw\" (UID: \"2574f63c-b7b2-4edf-a3f3-808a81438878\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.850201 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2574f63c-b7b2-4edf-a3f3-808a81438878-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw\" (UID: \"2574f63c-b7b2-4edf-a3f3-808a81438878\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.854628 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2574f63c-b7b2-4edf-a3f3-808a81438878-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw\" (UID: \"2574f63c-b7b2-4edf-a3f3-808a81438878\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.854621 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2574f63c-b7b2-4edf-a3f3-808a81438878-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw\" (UID: \"2574f63c-b7b2-4edf-a3f3-808a81438878\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.855439 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2574f63c-b7b2-4edf-a3f3-808a81438878-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw\" (UID: \"2574f63c-b7b2-4edf-a3f3-808a81438878\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw"
Nov 25 09:23:15 crc kubenswrapper[4565]: I1125 09:23:15.863728 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mndkm\" (UniqueName: \"kubernetes.io/projected/2574f63c-b7b2-4edf-a3f3-808a81438878-kube-api-access-mndkm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw\" (UID: \"2574f63c-b7b2-4edf-a3f3-808a81438878\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw"
Nov 25 09:23:16 crc kubenswrapper[4565]: I1125 09:23:16.036452 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw"
Nov 25 09:23:16 crc kubenswrapper[4565]: I1125 09:23:16.524295 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw"]
Nov 25 09:23:16 crc kubenswrapper[4565]: I1125 09:23:16.532525 4565 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 25 09:23:16 crc kubenswrapper[4565]: I1125 09:23:16.622060 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw" event={"ID":"2574f63c-b7b2-4edf-a3f3-808a81438878","Type":"ContainerStarted","Data":"0f0b09dc7908a72aee7f74d287a090f9357880b5f257a9e9ddc0f229b83f9c08"}
Nov 25 09:23:17 crc kubenswrapper[4565]: I1125 09:23:17.633437 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw" event={"ID":"2574f63c-b7b2-4edf-a3f3-808a81438878","Type":"ContainerStarted","Data":"3b6cb2ae6cfb621acc892b728bfdb02fea77c6cdc477a9198130152f57e41fcd"}
Nov 25 09:23:22 crc kubenswrapper[4565]: I1125 09:23:22.761106 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Nov 25 09:23:22 crc kubenswrapper[4565]: I1125 09:23:22.790344 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw" podStartSLOduration=7.302592914 podStartE2EDuration="7.790326582s" podCreationTimestamp="2025-11-25 09:23:15 +0000 UTC" firstStartedPulling="2025-11-25 09:23:16.532252494 +0000 UTC m=+1129.734747632" lastFinishedPulling="2025-11-25 09:23:17.019986162 +0000 UTC m=+1130.222481300" observedRunningTime="2025-11-25 09:23:17.652849463 +0000 UTC m=+1130.855344601" watchObservedRunningTime="2025-11-25 09:23:22.790326582 +0000 UTC m=+1135.992821720"
Nov 25 09:23:23 crc kubenswrapper[4565]: I1125 09:23:23.470132 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Nov 25 09:23:25 crc kubenswrapper[4565]: I1125 09:23:25.099071 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 09:23:25 crc kubenswrapper[4565]: I1125 09:23:25.099490 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 09:23:25 crc kubenswrapper[4565]: I1125 09:23:25.108107 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r28bt"
Nov 25 09:23:25 crc kubenswrapper[4565]: I1125 09:23:25.109013 4565 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"039cbb161cce07adb641b50f2f8a642843a69f44ac084886a6cded22a964aec2"} pod="openshift-machine-config-operator/machine-config-daemon-r28bt"
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 25 09:23:25 crc kubenswrapper[4565]: I1125 09:23:25.109094 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" containerID="cri-o://039cbb161cce07adb641b50f2f8a642843a69f44ac084886a6cded22a964aec2" gracePeriod=600
Nov 25 09:23:25 crc kubenswrapper[4565]: I1125 09:23:25.715074 4565 generic.go:334] "Generic (PLEG): container finished" podID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerID="039cbb161cce07adb641b50f2f8a642843a69f44ac084886a6cded22a964aec2" exitCode=0
Nov 25 09:23:25 crc kubenswrapper[4565]: I1125 09:23:25.715469 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerDied","Data":"039cbb161cce07adb641b50f2f8a642843a69f44ac084886a6cded22a964aec2"}
Nov 25 09:23:25 crc kubenswrapper[4565]: I1125 09:23:25.715525 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerStarted","Data":"7ed4699c003b3641688dbcd2051893b81e4b0be01a977d95c172b92c4c0191bf"}
Nov 25 09:23:25 crc kubenswrapper[4565]: I1125 09:23:25.715545 4565 scope.go:117] "RemoveContainer" containerID="f8cbf4d6873b3025c789286654bea15427f510e52a9c9dafb2d1c58270be257d"
Nov 25 09:25:25 crc kubenswrapper[4565]: I1125 09:25:25.099659 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 09:25:25 crc kubenswrapper[4565]: I1125 09:25:25.100582 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 09:25:34 crc kubenswrapper[4565]: I1125 09:25:34.263211 4565 scope.go:117] "RemoveContainer" containerID="7dc8e423018162e4767da1db52a8154d522ede88c87127b6cbc39e39f3b8acbf"
Nov 25 09:25:34 crc kubenswrapper[4565]: I1125 09:25:34.307880 4565 scope.go:117] "RemoveContainer" containerID="12ee085019ef236d8049f29ca4d408b616bcc6469785d3af3a3aaba23ecb87af"
Nov 25 09:25:55 crc kubenswrapper[4565]: I1125 09:25:55.099107 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 09:25:55 crc kubenswrapper[4565]: I1125 09:25:55.099817 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 09:26:25 crc kubenswrapper[4565]: I1125 09:26:25.099025 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 09:26:25 crc kubenswrapper[4565]: I1125 09:26:25.099668 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 09:26:25 crc kubenswrapper[4565]: I1125 09:26:25.106976 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r28bt"
Nov 25 09:26:25 crc kubenswrapper[4565]: I1125 09:26:25.107519 4565 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7ed4699c003b3641688dbcd2051893b81e4b0be01a977d95c172b92c4c0191bf"} pod="openshift-machine-config-operator/machine-config-daemon-r28bt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 25 09:26:25 crc kubenswrapper[4565]: I1125 09:26:25.107591 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" containerID="cri-o://7ed4699c003b3641688dbcd2051893b81e4b0be01a977d95c172b92c4c0191bf" gracePeriod=600
Nov 25 09:26:25 crc kubenswrapper[4565]: I1125 09:26:25.415699 4565 generic.go:334] "Generic (PLEG): container finished" podID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerID="7ed4699c003b3641688dbcd2051893b81e4b0be01a977d95c172b92c4c0191bf" exitCode=0
Nov 25 09:26:25 crc kubenswrapper[4565]: I1125 09:26:25.415762 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerDied","Data":"7ed4699c003b3641688dbcd2051893b81e4b0be01a977d95c172b92c4c0191bf"}
Nov 25 09:26:25 crc kubenswrapper[4565]: I1125 09:26:25.416049 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerStarted","Data":"892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d"}
Nov 25 09:26:25 crc kubenswrapper[4565]: I1125 09:26:25.416083 4565 scope.go:117] "RemoveContainer" containerID="039cbb161cce07adb641b50f2f8a642843a69f44ac084886a6cded22a964aec2"
Nov 25 09:26:44 crc kubenswrapper[4565]: I1125 09:26:44.592426 4565 generic.go:334] "Generic (PLEG): container finished" podID="2574f63c-b7b2-4edf-a3f3-808a81438878" containerID="3b6cb2ae6cfb621acc892b728bfdb02fea77c6cdc477a9198130152f57e41fcd" exitCode=0
Nov 25 09:26:44 crc kubenswrapper[4565]: I1125 09:26:44.592979 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw" event={"ID":"2574f63c-b7b2-4edf-a3f3-808a81438878","Type":"ContainerDied","Data":"3b6cb2ae6cfb621acc892b728bfdb02fea77c6cdc477a9198130152f57e41fcd"}
Nov 25 09:26:45 crc kubenswrapper[4565]: I1125 09:26:45.957520 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw"
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.158148 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2574f63c-b7b2-4edf-a3f3-808a81438878-bootstrap-combined-ca-bundle\") pod \"2574f63c-b7b2-4edf-a3f3-808a81438878\" (UID: \"2574f63c-b7b2-4edf-a3f3-808a81438878\") "
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.158248 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mndkm\" (UniqueName: \"kubernetes.io/projected/2574f63c-b7b2-4edf-a3f3-808a81438878-kube-api-access-mndkm\") pod \"2574f63c-b7b2-4edf-a3f3-808a81438878\" (UID: \"2574f63c-b7b2-4edf-a3f3-808a81438878\") "
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.159072 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2574f63c-b7b2-4edf-a3f3-808a81438878-inventory\") pod \"2574f63c-b7b2-4edf-a3f3-808a81438878\" (UID: \"2574f63c-b7b2-4edf-a3f3-808a81438878\") "
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.159603 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2574f63c-b7b2-4edf-a3f3-808a81438878-ssh-key\") pod \"2574f63c-b7b2-4edf-a3f3-808a81438878\" (UID: \"2574f63c-b7b2-4edf-a3f3-808a81438878\") "
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.165916 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2574f63c-b7b2-4edf-a3f3-808a81438878-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "2574f63c-b7b2-4edf-a3f3-808a81438878" (UID: "2574f63c-b7b2-4edf-a3f3-808a81438878"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.166295 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2574f63c-b7b2-4edf-a3f3-808a81438878-kube-api-access-mndkm" (OuterVolumeSpecName: "kube-api-access-mndkm") pod "2574f63c-b7b2-4edf-a3f3-808a81438878" (UID: "2574f63c-b7b2-4edf-a3f3-808a81438878"). InnerVolumeSpecName "kube-api-access-mndkm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.184489 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2574f63c-b7b2-4edf-a3f3-808a81438878-inventory" (OuterVolumeSpecName: "inventory") pod "2574f63c-b7b2-4edf-a3f3-808a81438878" (UID: "2574f63c-b7b2-4edf-a3f3-808a81438878"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.185609 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2574f63c-b7b2-4edf-a3f3-808a81438878-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2574f63c-b7b2-4edf-a3f3-808a81438878" (UID: "2574f63c-b7b2-4edf-a3f3-808a81438878"). InnerVolumeSpecName "ssh-key".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.264527 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2574f63c-b7b2-4edf-a3f3-808a81438878-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.264566 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mndkm\" (UniqueName: \"kubernetes.io/projected/2574f63c-b7b2-4edf-a3f3-808a81438878-kube-api-access-mndkm\") on node \"crc\" DevicePath \"\""
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.264583 4565 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2574f63c-b7b2-4edf-a3f3-808a81438878-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.264599 4565 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2574f63c-b7b2-4edf-a3f3-808a81438878-inventory\") on node \"crc\" DevicePath \"\""
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.615210 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw" event={"ID":"2574f63c-b7b2-4edf-a3f3-808a81438878","Type":"ContainerDied","Data":"0f0b09dc7908a72aee7f74d287a090f9357880b5f257a9e9ddc0f229b83f9c08"}
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.615280 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f0b09dc7908a72aee7f74d287a090f9357880b5f257a9e9ddc0f229b83f9c08"
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.615388 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw"
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.694151 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zj2st"]
Nov 25 09:26:46 crc kubenswrapper[4565]: E1125 09:26:46.694920 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2574f63c-b7b2-4edf-a3f3-808a81438878" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.694957 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="2574f63c-b7b2-4edf-a3f3-808a81438878" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.695199 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="2574f63c-b7b2-4edf-a3f3-808a81438878" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.695973 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zj2st"
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.698035 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.698436 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.702021 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.706047 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zj2st"]
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.706704 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc"
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.879519 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6043a07-8d34-4d17-84b7-71fcd378f0b6-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zj2st\" (UID: \"f6043a07-8d34-4d17-84b7-71fcd378f0b6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zj2st"
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.879599 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6043a07-8d34-4d17-84b7-71fcd378f0b6-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zj2st\" (UID: \"f6043a07-8d34-4d17-84b7-71fcd378f0b6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zj2st"
Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.879775 4565
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n69nh\" (UniqueName: \"kubernetes.io/projected/f6043a07-8d34-4d17-84b7-71fcd378f0b6-kube-api-access-n69nh\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zj2st\" (UID: \"f6043a07-8d34-4d17-84b7-71fcd378f0b6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zj2st" Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.982371 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6043a07-8d34-4d17-84b7-71fcd378f0b6-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zj2st\" (UID: \"f6043a07-8d34-4d17-84b7-71fcd378f0b6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zj2st" Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.982440 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n69nh\" (UniqueName: \"kubernetes.io/projected/f6043a07-8d34-4d17-84b7-71fcd378f0b6-kube-api-access-n69nh\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zj2st\" (UID: \"f6043a07-8d34-4d17-84b7-71fcd378f0b6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zj2st" Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.982713 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6043a07-8d34-4d17-84b7-71fcd378f0b6-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zj2st\" (UID: \"f6043a07-8d34-4d17-84b7-71fcd378f0b6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zj2st" Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.988467 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6043a07-8d34-4d17-84b7-71fcd378f0b6-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-zj2st\" (UID: \"f6043a07-8d34-4d17-84b7-71fcd378f0b6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zj2st" Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.992000 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6043a07-8d34-4d17-84b7-71fcd378f0b6-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zj2st\" (UID: \"f6043a07-8d34-4d17-84b7-71fcd378f0b6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zj2st" Nov 25 09:26:46 crc kubenswrapper[4565]: I1125 09:26:46.998668 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n69nh\" (UniqueName: \"kubernetes.io/projected/f6043a07-8d34-4d17-84b7-71fcd378f0b6-kube-api-access-n69nh\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zj2st\" (UID: \"f6043a07-8d34-4d17-84b7-71fcd378f0b6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zj2st" Nov 25 09:26:47 crc kubenswrapper[4565]: I1125 09:26:47.009426 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zj2st" Nov 25 09:26:47 crc kubenswrapper[4565]: I1125 09:26:47.510635 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zj2st"] Nov 25 09:26:47 crc kubenswrapper[4565]: I1125 09:26:47.627938 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zj2st" event={"ID":"f6043a07-8d34-4d17-84b7-71fcd378f0b6","Type":"ContainerStarted","Data":"20df443de821536690a9877972d3f1f7d94311e8f6d50bab8cb5f2c1d91bdda5"} Nov 25 09:26:48 crc kubenswrapper[4565]: I1125 09:26:48.644947 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zj2st" event={"ID":"f6043a07-8d34-4d17-84b7-71fcd378f0b6","Type":"ContainerStarted","Data":"72ac22149b770785f726966d3b3faf8bc91935ea780cf8513c3ed9a3e05794a4"} Nov 25 09:26:48 crc kubenswrapper[4565]: I1125 09:26:48.672436 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zj2st" podStartSLOduration=2.017397358 podStartE2EDuration="2.672415938s" podCreationTimestamp="2025-11-25 09:26:46 +0000 UTC" firstStartedPulling="2025-11-25 09:26:47.522326052 +0000 UTC m=+1340.724821181" lastFinishedPulling="2025-11-25 09:26:48.177344622 +0000 UTC m=+1341.379839761" observedRunningTime="2025-11-25 09:26:48.66760621 +0000 UTC m=+1341.870101348" watchObservedRunningTime="2025-11-25 09:26:48.672415938 +0000 UTC m=+1341.874911066" Nov 25 09:28:22 crc kubenswrapper[4565]: I1125 09:28:22.039218 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4bfd-account-create-cphjj"] Nov 25 09:28:22 crc kubenswrapper[4565]: I1125 09:28:22.055285 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4bfd-account-create-cphjj"] Nov 25 09:28:23 
crc kubenswrapper[4565]: I1125 09:28:23.040820 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-ztdrc"] Nov 25 09:28:23 crc kubenswrapper[4565]: I1125 09:28:23.048811 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-5q2pv"] Nov 25 09:28:23 crc kubenswrapper[4565]: I1125 09:28:23.055596 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-fbd2-account-create-5l9xb"] Nov 25 09:28:23 crc kubenswrapper[4565]: I1125 09:28:23.061847 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-p6cvm"] Nov 25 09:28:23 crc kubenswrapper[4565]: I1125 09:28:23.069659 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f100-account-create-sj2lj"] Nov 25 09:28:23 crc kubenswrapper[4565]: I1125 09:28:23.074243 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-p6cvm"] Nov 25 09:28:23 crc kubenswrapper[4565]: I1125 09:28:23.078599 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-ztdrc"] Nov 25 09:28:23 crc kubenswrapper[4565]: I1125 09:28:23.082942 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-fbd2-account-create-5l9xb"] Nov 25 09:28:23 crc kubenswrapper[4565]: I1125 09:28:23.087247 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f100-account-create-sj2lj"] Nov 25 09:28:23 crc kubenswrapper[4565]: I1125 09:28:23.091295 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-5q2pv"] Nov 25 09:28:23 crc kubenswrapper[4565]: I1125 09:28:23.105816 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d679723-ad82-4ea3-a423-46db64ebc105" path="/var/lib/kubelet/pods/1d679723-ad82-4ea3-a423-46db64ebc105/volumes" Nov 25 09:28:23 crc kubenswrapper[4565]: I1125 09:28:23.106580 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3c48b8c0-09b0-4c38-94a6-9fc970eab3ca" path="/var/lib/kubelet/pods/3c48b8c0-09b0-4c38-94a6-9fc970eab3ca/volumes" Nov 25 09:28:23 crc kubenswrapper[4565]: I1125 09:28:23.107232 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b" path="/var/lib/kubelet/pods/4cf7c33b-48ca-41a9-9d40-c0d12e5fa07b/volumes" Nov 25 09:28:23 crc kubenswrapper[4565]: I1125 09:28:23.107812 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca742c16-4156-489a-8989-a25f91f6ef78" path="/var/lib/kubelet/pods/ca742c16-4156-489a-8989-a25f91f6ef78/volumes" Nov 25 09:28:23 crc kubenswrapper[4565]: I1125 09:28:23.108391 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8124d68-2866-4758-9728-769b860913ee" path="/var/lib/kubelet/pods/d8124d68-2866-4758-9728-769b860913ee/volumes" Nov 25 09:28:23 crc kubenswrapper[4565]: I1125 09:28:23.109477 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7bb9655-fc2f-44fd-8910-541992e2896a" path="/var/lib/kubelet/pods/e7bb9655-fc2f-44fd-8910-541992e2896a/volumes" Nov 25 09:28:23 crc kubenswrapper[4565]: I1125 09:28:23.605752 4565 generic.go:334] "Generic (PLEG): container finished" podID="f6043a07-8d34-4d17-84b7-71fcd378f0b6" containerID="72ac22149b770785f726966d3b3faf8bc91935ea780cf8513c3ed9a3e05794a4" exitCode=0 Nov 25 09:28:23 crc kubenswrapper[4565]: I1125 09:28:23.605840 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zj2st" event={"ID":"f6043a07-8d34-4d17-84b7-71fcd378f0b6","Type":"ContainerDied","Data":"72ac22149b770785f726966d3b3faf8bc91935ea780cf8513c3ed9a3e05794a4"} Nov 25 09:28:24 crc kubenswrapper[4565]: I1125 09:28:24.910276 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hkhjn"] Nov 25 09:28:24 crc kubenswrapper[4565]: I1125 09:28:24.913961 4565 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hkhjn" Nov 25 09:28:24 crc kubenswrapper[4565]: I1125 09:28:24.935703 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hkhjn"] Nov 25 09:28:24 crc kubenswrapper[4565]: I1125 09:28:24.946723 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/847b22b8-f192-461f-84bc-9133ebe4cfbc-catalog-content\") pod \"redhat-marketplace-hkhjn\" (UID: \"847b22b8-f192-461f-84bc-9133ebe4cfbc\") " pod="openshift-marketplace/redhat-marketplace-hkhjn" Nov 25 09:28:24 crc kubenswrapper[4565]: I1125 09:28:24.946778 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/847b22b8-f192-461f-84bc-9133ebe4cfbc-utilities\") pod \"redhat-marketplace-hkhjn\" (UID: \"847b22b8-f192-461f-84bc-9133ebe4cfbc\") " pod="openshift-marketplace/redhat-marketplace-hkhjn" Nov 25 09:28:24 crc kubenswrapper[4565]: I1125 09:28:24.946916 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-966b5\" (UniqueName: \"kubernetes.io/projected/847b22b8-f192-461f-84bc-9133ebe4cfbc-kube-api-access-966b5\") pod \"redhat-marketplace-hkhjn\" (UID: \"847b22b8-f192-461f-84bc-9133ebe4cfbc\") " pod="openshift-marketplace/redhat-marketplace-hkhjn" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.000433 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zj2st" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.049546 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-966b5\" (UniqueName: \"kubernetes.io/projected/847b22b8-f192-461f-84bc-9133ebe4cfbc-kube-api-access-966b5\") pod \"redhat-marketplace-hkhjn\" (UID: \"847b22b8-f192-461f-84bc-9133ebe4cfbc\") " pod="openshift-marketplace/redhat-marketplace-hkhjn" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.049727 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/847b22b8-f192-461f-84bc-9133ebe4cfbc-catalog-content\") pod \"redhat-marketplace-hkhjn\" (UID: \"847b22b8-f192-461f-84bc-9133ebe4cfbc\") " pod="openshift-marketplace/redhat-marketplace-hkhjn" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.049817 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/847b22b8-f192-461f-84bc-9133ebe4cfbc-utilities\") pod \"redhat-marketplace-hkhjn\" (UID: \"847b22b8-f192-461f-84bc-9133ebe4cfbc\") " pod="openshift-marketplace/redhat-marketplace-hkhjn" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.050463 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/847b22b8-f192-461f-84bc-9133ebe4cfbc-utilities\") pod \"redhat-marketplace-hkhjn\" (UID: \"847b22b8-f192-461f-84bc-9133ebe4cfbc\") " pod="openshift-marketplace/redhat-marketplace-hkhjn" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.050475 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/847b22b8-f192-461f-84bc-9133ebe4cfbc-catalog-content\") pod \"redhat-marketplace-hkhjn\" (UID: \"847b22b8-f192-461f-84bc-9133ebe4cfbc\") " 
pod="openshift-marketplace/redhat-marketplace-hkhjn" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.070427 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-966b5\" (UniqueName: \"kubernetes.io/projected/847b22b8-f192-461f-84bc-9133ebe4cfbc-kube-api-access-966b5\") pod \"redhat-marketplace-hkhjn\" (UID: \"847b22b8-f192-461f-84bc-9133ebe4cfbc\") " pod="openshift-marketplace/redhat-marketplace-hkhjn" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.099376 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.099426 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.151376 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n69nh\" (UniqueName: \"kubernetes.io/projected/f6043a07-8d34-4d17-84b7-71fcd378f0b6-kube-api-access-n69nh\") pod \"f6043a07-8d34-4d17-84b7-71fcd378f0b6\" (UID: \"f6043a07-8d34-4d17-84b7-71fcd378f0b6\") " Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.151761 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6043a07-8d34-4d17-84b7-71fcd378f0b6-ssh-key\") pod \"f6043a07-8d34-4d17-84b7-71fcd378f0b6\" (UID: \"f6043a07-8d34-4d17-84b7-71fcd378f0b6\") " Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.152055 4565 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6043a07-8d34-4d17-84b7-71fcd378f0b6-inventory\") pod \"f6043a07-8d34-4d17-84b7-71fcd378f0b6\" (UID: \"f6043a07-8d34-4d17-84b7-71fcd378f0b6\") " Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.155951 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6043a07-8d34-4d17-84b7-71fcd378f0b6-kube-api-access-n69nh" (OuterVolumeSpecName: "kube-api-access-n69nh") pod "f6043a07-8d34-4d17-84b7-71fcd378f0b6" (UID: "f6043a07-8d34-4d17-84b7-71fcd378f0b6"). InnerVolumeSpecName "kube-api-access-n69nh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.183535 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6043a07-8d34-4d17-84b7-71fcd378f0b6-inventory" (OuterVolumeSpecName: "inventory") pod "f6043a07-8d34-4d17-84b7-71fcd378f0b6" (UID: "f6043a07-8d34-4d17-84b7-71fcd378f0b6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.197327 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6043a07-8d34-4d17-84b7-71fcd378f0b6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f6043a07-8d34-4d17-84b7-71fcd378f0b6" (UID: "f6043a07-8d34-4d17-84b7-71fcd378f0b6"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.256356 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n69nh\" (UniqueName: \"kubernetes.io/projected/f6043a07-8d34-4d17-84b7-71fcd378f0b6-kube-api-access-n69nh\") on node \"crc\" DevicePath \"\"" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.256395 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6043a07-8d34-4d17-84b7-71fcd378f0b6-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.256413 4565 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6043a07-8d34-4d17-84b7-71fcd378f0b6-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.311128 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hkhjn" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.634849 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zj2st" event={"ID":"f6043a07-8d34-4d17-84b7-71fcd378f0b6","Type":"ContainerDied","Data":"20df443de821536690a9877972d3f1f7d94311e8f6d50bab8cb5f2c1d91bdda5"} Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.634922 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20df443de821536690a9877972d3f1f7d94311e8f6d50bab8cb5f2c1d91bdda5" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.635361 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zj2st" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.741725 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w"] Nov 25 09:28:25 crc kubenswrapper[4565]: E1125 09:28:25.742589 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6043a07-8d34-4d17-84b7-71fcd378f0b6" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.742682 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6043a07-8d34-4d17-84b7-71fcd378f0b6" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.743007 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6043a07-8d34-4d17-84b7-71fcd378f0b6" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.743808 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.748454 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.748681 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.749020 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.749150 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.755410 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hkhjn"] Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.764252 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w"] Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.876218 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3da40e24-5d26-4a34-a822-67953bfe3207-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w\" (UID: \"3da40e24-5d26-4a34-a822-67953bfe3207\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.876309 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7gj4\" (UniqueName: \"kubernetes.io/projected/3da40e24-5d26-4a34-a822-67953bfe3207-kube-api-access-n7gj4\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w\" (UID: 
\"3da40e24-5d26-4a34-a822-67953bfe3207\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.876623 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3da40e24-5d26-4a34-a822-67953bfe3207-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w\" (UID: \"3da40e24-5d26-4a34-a822-67953bfe3207\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.979598 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3da40e24-5d26-4a34-a822-67953bfe3207-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w\" (UID: \"3da40e24-5d26-4a34-a822-67953bfe3207\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.979689 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7gj4\" (UniqueName: \"kubernetes.io/projected/3da40e24-5d26-4a34-a822-67953bfe3207-kube-api-access-n7gj4\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w\" (UID: \"3da40e24-5d26-4a34-a822-67953bfe3207\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.979800 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3da40e24-5d26-4a34-a822-67953bfe3207-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w\" (UID: \"3da40e24-5d26-4a34-a822-67953bfe3207\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.987227 4565 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3da40e24-5d26-4a34-a822-67953bfe3207-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w\" (UID: \"3da40e24-5d26-4a34-a822-67953bfe3207\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w" Nov 25 09:28:25 crc kubenswrapper[4565]: I1125 09:28:25.987333 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3da40e24-5d26-4a34-a822-67953bfe3207-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w\" (UID: \"3da40e24-5d26-4a34-a822-67953bfe3207\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w" Nov 25 09:28:26 crc kubenswrapper[4565]: I1125 09:28:26.000247 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7gj4\" (UniqueName: \"kubernetes.io/projected/3da40e24-5d26-4a34-a822-67953bfe3207-kube-api-access-n7gj4\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w\" (UID: \"3da40e24-5d26-4a34-a822-67953bfe3207\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w" Nov 25 09:28:26 crc kubenswrapper[4565]: I1125 09:28:26.062187 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w" Nov 25 09:28:26 crc kubenswrapper[4565]: I1125 09:28:26.641092 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w"] Nov 25 09:28:26 crc kubenswrapper[4565]: I1125 09:28:26.645126 4565 generic.go:334] "Generic (PLEG): container finished" podID="847b22b8-f192-461f-84bc-9133ebe4cfbc" containerID="e6b56255d18e2dddc53354ce9c671de03ae1b8eddecfe21ca473a4b374ee2732" exitCode=0 Nov 25 09:28:26 crc kubenswrapper[4565]: I1125 09:28:26.645179 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkhjn" event={"ID":"847b22b8-f192-461f-84bc-9133ebe4cfbc","Type":"ContainerDied","Data":"e6b56255d18e2dddc53354ce9c671de03ae1b8eddecfe21ca473a4b374ee2732"} Nov 25 09:28:26 crc kubenswrapper[4565]: I1125 09:28:26.645400 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkhjn" event={"ID":"847b22b8-f192-461f-84bc-9133ebe4cfbc","Type":"ContainerStarted","Data":"c9f48b224ae21afa5f9713654b2e57ffedf4dc35ca0f91ac629263066faf0852"} Nov 25 09:28:26 crc kubenswrapper[4565]: I1125 09:28:26.647651 4565 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 09:28:27 crc kubenswrapper[4565]: I1125 09:28:27.130629 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 09:28:27 crc kubenswrapper[4565]: I1125 09:28:27.654991 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkhjn" event={"ID":"847b22b8-f192-461f-84bc-9133ebe4cfbc","Type":"ContainerStarted","Data":"5e70a3597a5a0b5cd9a3bbebefdfc0a08a669659771bb32d8bb9a027df1370db"} Nov 25 09:28:27 crc kubenswrapper[4565]: I1125 09:28:27.656257 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w" event={"ID":"3da40e24-5d26-4a34-a822-67953bfe3207","Type":"ContainerStarted","Data":"ff3cba882720548764d9b2963520e20956b57f03b7f6243fbdc164c580cea380"} Nov 25 09:28:27 crc kubenswrapper[4565]: I1125 09:28:27.656300 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w" event={"ID":"3da40e24-5d26-4a34-a822-67953bfe3207","Type":"ContainerStarted","Data":"249a2b7a38d5f6377846538f70b846b31d38a528cd8b41d86cd359b1f8bb507e"} Nov 25 09:28:27 crc kubenswrapper[4565]: I1125 09:28:27.682679 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w" podStartSLOduration=2.213249034 podStartE2EDuration="2.682668131s" podCreationTimestamp="2025-11-25 09:28:25 +0000 UTC" firstStartedPulling="2025-11-25 09:28:26.657856034 +0000 UTC m=+1439.860351171" lastFinishedPulling="2025-11-25 09:28:27.12727513 +0000 UTC m=+1440.329770268" observedRunningTime="2025-11-25 09:28:27.679349714 +0000 UTC m=+1440.881844843" watchObservedRunningTime="2025-11-25 09:28:27.682668131 +0000 UTC m=+1440.885163269" Nov 25 09:28:28 crc kubenswrapper[4565]: I1125 09:28:28.667742 4565 generic.go:334] "Generic (PLEG): container finished" podID="847b22b8-f192-461f-84bc-9133ebe4cfbc" containerID="5e70a3597a5a0b5cd9a3bbebefdfc0a08a669659771bb32d8bb9a027df1370db" exitCode=0 Nov 25 09:28:28 crc kubenswrapper[4565]: I1125 09:28:28.668004 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkhjn" event={"ID":"847b22b8-f192-461f-84bc-9133ebe4cfbc","Type":"ContainerDied","Data":"5e70a3597a5a0b5cd9a3bbebefdfc0a08a669659771bb32d8bb9a027df1370db"} Nov 25 09:28:29 crc kubenswrapper[4565]: I1125 09:28:29.678353 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkhjn" 
event={"ID":"847b22b8-f192-461f-84bc-9133ebe4cfbc","Type":"ContainerStarted","Data":"c1690f4d6808bf84b4192225cd0a48fbf37a0d817b440c6e1528e74cf5404890"} Nov 25 09:28:29 crc kubenswrapper[4565]: I1125 09:28:29.700732 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hkhjn" podStartSLOduration=3.185619525 podStartE2EDuration="5.700712809s" podCreationTimestamp="2025-11-25 09:28:24 +0000 UTC" firstStartedPulling="2025-11-25 09:28:26.647416803 +0000 UTC m=+1439.849911941" lastFinishedPulling="2025-11-25 09:28:29.162510087 +0000 UTC m=+1442.365005225" observedRunningTime="2025-11-25 09:28:29.693989744 +0000 UTC m=+1442.896484883" watchObservedRunningTime="2025-11-25 09:28:29.700712809 +0000 UTC m=+1442.903207946" Nov 25 09:28:31 crc kubenswrapper[4565]: I1125 09:28:31.696807 4565 generic.go:334] "Generic (PLEG): container finished" podID="3da40e24-5d26-4a34-a822-67953bfe3207" containerID="ff3cba882720548764d9b2963520e20956b57f03b7f6243fbdc164c580cea380" exitCode=0 Nov 25 09:28:31 crc kubenswrapper[4565]: I1125 09:28:31.696900 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w" event={"ID":"3da40e24-5d26-4a34-a822-67953bfe3207","Type":"ContainerDied","Data":"ff3cba882720548764d9b2963520e20956b57f03b7f6243fbdc164c580cea380"} Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.065269 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.254275 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7gj4\" (UniqueName: \"kubernetes.io/projected/3da40e24-5d26-4a34-a822-67953bfe3207-kube-api-access-n7gj4\") pod \"3da40e24-5d26-4a34-a822-67953bfe3207\" (UID: \"3da40e24-5d26-4a34-a822-67953bfe3207\") " Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.254505 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3da40e24-5d26-4a34-a822-67953bfe3207-ssh-key\") pod \"3da40e24-5d26-4a34-a822-67953bfe3207\" (UID: \"3da40e24-5d26-4a34-a822-67953bfe3207\") " Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.254580 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3da40e24-5d26-4a34-a822-67953bfe3207-inventory\") pod \"3da40e24-5d26-4a34-a822-67953bfe3207\" (UID: \"3da40e24-5d26-4a34-a822-67953bfe3207\") " Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.266053 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3da40e24-5d26-4a34-a822-67953bfe3207-kube-api-access-n7gj4" (OuterVolumeSpecName: "kube-api-access-n7gj4") pod "3da40e24-5d26-4a34-a822-67953bfe3207" (UID: "3da40e24-5d26-4a34-a822-67953bfe3207"). InnerVolumeSpecName "kube-api-access-n7gj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.281117 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da40e24-5d26-4a34-a822-67953bfe3207-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3da40e24-5d26-4a34-a822-67953bfe3207" (UID: "3da40e24-5d26-4a34-a822-67953bfe3207"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.286266 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da40e24-5d26-4a34-a822-67953bfe3207-inventory" (OuterVolumeSpecName: "inventory") pod "3da40e24-5d26-4a34-a822-67953bfe3207" (UID: "3da40e24-5d26-4a34-a822-67953bfe3207"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.357449 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3da40e24-5d26-4a34-a822-67953bfe3207-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.357476 4565 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3da40e24-5d26-4a34-a822-67953bfe3207-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.357487 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7gj4\" (UniqueName: \"kubernetes.io/projected/3da40e24-5d26-4a34-a822-67953bfe3207-kube-api-access-n7gj4\") on node \"crc\" DevicePath \"\"" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.721386 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w" event={"ID":"3da40e24-5d26-4a34-a822-67953bfe3207","Type":"ContainerDied","Data":"249a2b7a38d5f6377846538f70b846b31d38a528cd8b41d86cd359b1f8bb507e"} Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.721466 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="249a2b7a38d5f6377846538f70b846b31d38a528cd8b41d86cd359b1f8bb507e" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.721471 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.788659 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-689ff"] Nov 25 09:28:33 crc kubenswrapper[4565]: E1125 09:28:33.789027 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da40e24-5d26-4a34-a822-67953bfe3207" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.789047 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da40e24-5d26-4a34-a822-67953bfe3207" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.789214 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="3da40e24-5d26-4a34-a822-67953bfe3207" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.789786 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-689ff" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.797396 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.797658 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.798321 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.798469 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.806167 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-689ff"] Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.867887 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmdvq\" (UniqueName: \"kubernetes.io/projected/8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe-kube-api-access-gmdvq\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-689ff\" (UID: \"8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-689ff" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.867973 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-689ff\" (UID: \"8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-689ff" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.868141 4565 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-689ff\" (UID: \"8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-689ff" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.970413 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmdvq\" (UniqueName: \"kubernetes.io/projected/8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe-kube-api-access-gmdvq\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-689ff\" (UID: \"8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-689ff" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.970808 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-689ff\" (UID: \"8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-689ff" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.970899 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-689ff\" (UID: \"8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-689ff" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.978063 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-689ff\" (UID: 
\"8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-689ff" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.979438 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-689ff\" (UID: \"8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-689ff" Nov 25 09:28:33 crc kubenswrapper[4565]: I1125 09:28:33.985584 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmdvq\" (UniqueName: \"kubernetes.io/projected/8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe-kube-api-access-gmdvq\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-689ff\" (UID: \"8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-689ff" Nov 25 09:28:34 crc kubenswrapper[4565]: I1125 09:28:34.109267 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-689ff" Nov 25 09:28:34 crc kubenswrapper[4565]: I1125 09:28:34.432125 4565 scope.go:117] "RemoveContainer" containerID="a34c4cfe9cd9fe86d1bce765c69b87614e8dce9aff0e02130376a298dea7ab51" Nov 25 09:28:34 crc kubenswrapper[4565]: I1125 09:28:34.455344 4565 scope.go:117] "RemoveContainer" containerID="f85e02a97582edb6076c9c3a9446e9b8296e9dc8468cb9e9a32cfaa151dd5441" Nov 25 09:28:34 crc kubenswrapper[4565]: I1125 09:28:34.493626 4565 scope.go:117] "RemoveContainer" containerID="543858316b8b49fb47d3fe9c8bd5ad5195e17d16d0f91fad569631c21fb9476c" Nov 25 09:28:34 crc kubenswrapper[4565]: I1125 09:28:34.527791 4565 scope.go:117] "RemoveContainer" containerID="f4ade216b44359441d097c21ab05b852dc569e854fbb2d38fca4dde013562a68" Nov 25 09:28:34 crc kubenswrapper[4565]: I1125 09:28:34.551192 4565 scope.go:117] "RemoveContainer" containerID="56cffca7d8be4f95f11f1401ddb5d6785553f52070afa653c8ae083e887ca502" Nov 25 09:28:34 crc kubenswrapper[4565]: I1125 09:28:34.570539 4565 scope.go:117] "RemoveContainer" containerID="2c02fd4fc7b83a3308e6cfff9270bdad297a061340017877b8c47702cb174d34" Nov 25 09:28:34 crc kubenswrapper[4565]: I1125 09:28:34.625159 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-689ff"] Nov 25 09:28:34 crc kubenswrapper[4565]: I1125 09:28:34.731861 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-689ff" event={"ID":"8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe","Type":"ContainerStarted","Data":"29d967d6d3de4cba80be458e5a7974a88530e8590920a3dae9c64477a00e6617"} Nov 25 09:28:35 crc kubenswrapper[4565]: I1125 09:28:35.312186 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hkhjn" Nov 25 09:28:35 crc kubenswrapper[4565]: I1125 09:28:35.312442 4565 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hkhjn" Nov 25 09:28:35 crc kubenswrapper[4565]: I1125 09:28:35.361085 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hkhjn" Nov 25 09:28:35 crc kubenswrapper[4565]: I1125 09:28:35.743472 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-689ff" event={"ID":"8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe","Type":"ContainerStarted","Data":"3a7b58c76992e9ac43411a2c966c07e745cd234be0fa4cd96d543e695fdd3a64"} Nov 25 09:28:35 crc kubenswrapper[4565]: I1125 09:28:35.764544 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-689ff" podStartSLOduration=2.237779697 podStartE2EDuration="2.764522896s" podCreationTimestamp="2025-11-25 09:28:33 +0000 UTC" firstStartedPulling="2025-11-25 09:28:34.638252013 +0000 UTC m=+1447.840747151" lastFinishedPulling="2025-11-25 09:28:35.164995211 +0000 UTC m=+1448.367490350" observedRunningTime="2025-11-25 09:28:35.761631956 +0000 UTC m=+1448.964127095" watchObservedRunningTime="2025-11-25 09:28:35.764522896 +0000 UTC m=+1448.967018034" Nov 25 09:28:35 crc kubenswrapper[4565]: I1125 09:28:35.792996 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hkhjn" Nov 25 09:28:35 crc kubenswrapper[4565]: I1125 09:28:35.842642 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hkhjn"] Nov 25 09:28:37 crc kubenswrapper[4565]: I1125 09:28:37.762731 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hkhjn" podUID="847b22b8-f192-461f-84bc-9133ebe4cfbc" containerName="registry-server" containerID="cri-o://c1690f4d6808bf84b4192225cd0a48fbf37a0d817b440c6e1528e74cf5404890" gracePeriod=2 Nov 25 
09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.175692 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hkhjn" Nov 25 09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.368453 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/847b22b8-f192-461f-84bc-9133ebe4cfbc-catalog-content\") pod \"847b22b8-f192-461f-84bc-9133ebe4cfbc\" (UID: \"847b22b8-f192-461f-84bc-9133ebe4cfbc\") " Nov 25 09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.369144 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/847b22b8-f192-461f-84bc-9133ebe4cfbc-utilities\") pod \"847b22b8-f192-461f-84bc-9133ebe4cfbc\" (UID: \"847b22b8-f192-461f-84bc-9133ebe4cfbc\") " Nov 25 09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.369502 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-966b5\" (UniqueName: \"kubernetes.io/projected/847b22b8-f192-461f-84bc-9133ebe4cfbc-kube-api-access-966b5\") pod \"847b22b8-f192-461f-84bc-9133ebe4cfbc\" (UID: \"847b22b8-f192-461f-84bc-9133ebe4cfbc\") " Nov 25 09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.370306 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/847b22b8-f192-461f-84bc-9133ebe4cfbc-utilities" (OuterVolumeSpecName: "utilities") pod "847b22b8-f192-461f-84bc-9133ebe4cfbc" (UID: "847b22b8-f192-461f-84bc-9133ebe4cfbc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.370510 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/847b22b8-f192-461f-84bc-9133ebe4cfbc-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.380264 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/847b22b8-f192-461f-84bc-9133ebe4cfbc-kube-api-access-966b5" (OuterVolumeSpecName: "kube-api-access-966b5") pod "847b22b8-f192-461f-84bc-9133ebe4cfbc" (UID: "847b22b8-f192-461f-84bc-9133ebe4cfbc"). InnerVolumeSpecName "kube-api-access-966b5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.384859 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/847b22b8-f192-461f-84bc-9133ebe4cfbc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "847b22b8-f192-461f-84bc-9133ebe4cfbc" (UID: "847b22b8-f192-461f-84bc-9133ebe4cfbc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.472500 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/847b22b8-f192-461f-84bc-9133ebe4cfbc-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.472539 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-966b5\" (UniqueName: \"kubernetes.io/projected/847b22b8-f192-461f-84bc-9133ebe4cfbc-kube-api-access-966b5\") on node \"crc\" DevicePath \"\"" Nov 25 09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.774238 4565 generic.go:334] "Generic (PLEG): container finished" podID="847b22b8-f192-461f-84bc-9133ebe4cfbc" containerID="c1690f4d6808bf84b4192225cd0a48fbf37a0d817b440c6e1528e74cf5404890" exitCode=0 Nov 25 09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.774289 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkhjn" event={"ID":"847b22b8-f192-461f-84bc-9133ebe4cfbc","Type":"ContainerDied","Data":"c1690f4d6808bf84b4192225cd0a48fbf37a0d817b440c6e1528e74cf5404890"} Nov 25 09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.774329 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkhjn" event={"ID":"847b22b8-f192-461f-84bc-9133ebe4cfbc","Type":"ContainerDied","Data":"c9f48b224ae21afa5f9713654b2e57ffedf4dc35ca0f91ac629263066faf0852"} Nov 25 09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.774351 4565 scope.go:117] "RemoveContainer" containerID="c1690f4d6808bf84b4192225cd0a48fbf37a0d817b440c6e1528e74cf5404890" Nov 25 09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.774517 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hkhjn" Nov 25 09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.803916 4565 scope.go:117] "RemoveContainer" containerID="5e70a3597a5a0b5cd9a3bbebefdfc0a08a669659771bb32d8bb9a027df1370db" Nov 25 09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.813310 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hkhjn"] Nov 25 09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.823455 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hkhjn"] Nov 25 09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.848002 4565 scope.go:117] "RemoveContainer" containerID="e6b56255d18e2dddc53354ce9c671de03ae1b8eddecfe21ca473a4b374ee2732" Nov 25 09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.880807 4565 scope.go:117] "RemoveContainer" containerID="c1690f4d6808bf84b4192225cd0a48fbf37a0d817b440c6e1528e74cf5404890" Nov 25 09:28:38 crc kubenswrapper[4565]: E1125 09:28:38.881459 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1690f4d6808bf84b4192225cd0a48fbf37a0d817b440c6e1528e74cf5404890\": container with ID starting with c1690f4d6808bf84b4192225cd0a48fbf37a0d817b440c6e1528e74cf5404890 not found: ID does not exist" containerID="c1690f4d6808bf84b4192225cd0a48fbf37a0d817b440c6e1528e74cf5404890" Nov 25 09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.881505 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1690f4d6808bf84b4192225cd0a48fbf37a0d817b440c6e1528e74cf5404890"} err="failed to get container status \"c1690f4d6808bf84b4192225cd0a48fbf37a0d817b440c6e1528e74cf5404890\": rpc error: code = NotFound desc = could not find container \"c1690f4d6808bf84b4192225cd0a48fbf37a0d817b440c6e1528e74cf5404890\": container with ID starting with c1690f4d6808bf84b4192225cd0a48fbf37a0d817b440c6e1528e74cf5404890 not found: 
ID does not exist" Nov 25 09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.881537 4565 scope.go:117] "RemoveContainer" containerID="5e70a3597a5a0b5cd9a3bbebefdfc0a08a669659771bb32d8bb9a027df1370db" Nov 25 09:28:38 crc kubenswrapper[4565]: E1125 09:28:38.881872 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e70a3597a5a0b5cd9a3bbebefdfc0a08a669659771bb32d8bb9a027df1370db\": container with ID starting with 5e70a3597a5a0b5cd9a3bbebefdfc0a08a669659771bb32d8bb9a027df1370db not found: ID does not exist" containerID="5e70a3597a5a0b5cd9a3bbebefdfc0a08a669659771bb32d8bb9a027df1370db" Nov 25 09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.881896 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e70a3597a5a0b5cd9a3bbebefdfc0a08a669659771bb32d8bb9a027df1370db"} err="failed to get container status \"5e70a3597a5a0b5cd9a3bbebefdfc0a08a669659771bb32d8bb9a027df1370db\": rpc error: code = NotFound desc = could not find container \"5e70a3597a5a0b5cd9a3bbebefdfc0a08a669659771bb32d8bb9a027df1370db\": container with ID starting with 5e70a3597a5a0b5cd9a3bbebefdfc0a08a669659771bb32d8bb9a027df1370db not found: ID does not exist" Nov 25 09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.881918 4565 scope.go:117] "RemoveContainer" containerID="e6b56255d18e2dddc53354ce9c671de03ae1b8eddecfe21ca473a4b374ee2732" Nov 25 09:28:38 crc kubenswrapper[4565]: E1125 09:28:38.882520 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6b56255d18e2dddc53354ce9c671de03ae1b8eddecfe21ca473a4b374ee2732\": container with ID starting with e6b56255d18e2dddc53354ce9c671de03ae1b8eddecfe21ca473a4b374ee2732 not found: ID does not exist" containerID="e6b56255d18e2dddc53354ce9c671de03ae1b8eddecfe21ca473a4b374ee2732" Nov 25 09:28:38 crc kubenswrapper[4565]: I1125 09:28:38.882570 4565 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6b56255d18e2dddc53354ce9c671de03ae1b8eddecfe21ca473a4b374ee2732"} err="failed to get container status \"e6b56255d18e2dddc53354ce9c671de03ae1b8eddecfe21ca473a4b374ee2732\": rpc error: code = NotFound desc = could not find container \"e6b56255d18e2dddc53354ce9c671de03ae1b8eddecfe21ca473a4b374ee2732\": container with ID starting with e6b56255d18e2dddc53354ce9c671de03ae1b8eddecfe21ca473a4b374ee2732 not found: ID does not exist" Nov 25 09:28:39 crc kubenswrapper[4565]: I1125 09:28:39.108658 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="847b22b8-f192-461f-84bc-9133ebe4cfbc" path="/var/lib/kubelet/pods/847b22b8-f192-461f-84bc-9133ebe4cfbc/volumes" Nov 25 09:28:47 crc kubenswrapper[4565]: I1125 09:28:47.039823 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-pmdd7"] Nov 25 09:28:47 crc kubenswrapper[4565]: I1125 09:28:47.047568 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-pmdd7"] Nov 25 09:28:47 crc kubenswrapper[4565]: I1125 09:28:47.112631 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="405219a8-725f-4996-9efa-02837290d5e8" path="/var/lib/kubelet/pods/405219a8-725f-4996-9efa-02837290d5e8/volumes" Nov 25 09:28:47 crc kubenswrapper[4565]: I1125 09:28:47.663086 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vb7kf"] Nov 25 09:28:47 crc kubenswrapper[4565]: E1125 09:28:47.663882 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="847b22b8-f192-461f-84bc-9133ebe4cfbc" containerName="registry-server" Nov 25 09:28:47 crc kubenswrapper[4565]: I1125 09:28:47.663910 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="847b22b8-f192-461f-84bc-9133ebe4cfbc" containerName="registry-server" Nov 25 09:28:47 crc kubenswrapper[4565]: E1125 09:28:47.663954 4565 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="847b22b8-f192-461f-84bc-9133ebe4cfbc" containerName="extract-utilities" Nov 25 09:28:47 crc kubenswrapper[4565]: I1125 09:28:47.663963 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="847b22b8-f192-461f-84bc-9133ebe4cfbc" containerName="extract-utilities" Nov 25 09:28:47 crc kubenswrapper[4565]: E1125 09:28:47.663981 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="847b22b8-f192-461f-84bc-9133ebe4cfbc" containerName="extract-content" Nov 25 09:28:47 crc kubenswrapper[4565]: I1125 09:28:47.663988 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="847b22b8-f192-461f-84bc-9133ebe4cfbc" containerName="extract-content" Nov 25 09:28:47 crc kubenswrapper[4565]: I1125 09:28:47.664232 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="847b22b8-f192-461f-84bc-9133ebe4cfbc" containerName="registry-server" Nov 25 09:28:47 crc kubenswrapper[4565]: I1125 09:28:47.665859 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vb7kf" Nov 25 09:28:47 crc kubenswrapper[4565]: I1125 09:28:47.678316 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fafe6cf9-152c-4f70-871c-356528bad9a3-catalog-content\") pod \"community-operators-vb7kf\" (UID: \"fafe6cf9-152c-4f70-871c-356528bad9a3\") " pod="openshift-marketplace/community-operators-vb7kf" Nov 25 09:28:47 crc kubenswrapper[4565]: I1125 09:28:47.678503 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmx8s\" (UniqueName: \"kubernetes.io/projected/fafe6cf9-152c-4f70-871c-356528bad9a3-kube-api-access-tmx8s\") pod \"community-operators-vb7kf\" (UID: \"fafe6cf9-152c-4f70-871c-356528bad9a3\") " pod="openshift-marketplace/community-operators-vb7kf" Nov 25 09:28:47 crc kubenswrapper[4565]: I1125 09:28:47.678627 4565 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fafe6cf9-152c-4f70-871c-356528bad9a3-utilities\") pod \"community-operators-vb7kf\" (UID: \"fafe6cf9-152c-4f70-871c-356528bad9a3\") " pod="openshift-marketplace/community-operators-vb7kf" Nov 25 09:28:47 crc kubenswrapper[4565]: I1125 09:28:47.680030 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vb7kf"] Nov 25 09:28:47 crc kubenswrapper[4565]: I1125 09:28:47.779992 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmx8s\" (UniqueName: \"kubernetes.io/projected/fafe6cf9-152c-4f70-871c-356528bad9a3-kube-api-access-tmx8s\") pod \"community-operators-vb7kf\" (UID: \"fafe6cf9-152c-4f70-871c-356528bad9a3\") " pod="openshift-marketplace/community-operators-vb7kf" Nov 25 09:28:47 crc kubenswrapper[4565]: I1125 09:28:47.780048 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fafe6cf9-152c-4f70-871c-356528bad9a3-utilities\") pod \"community-operators-vb7kf\" (UID: \"fafe6cf9-152c-4f70-871c-356528bad9a3\") " pod="openshift-marketplace/community-operators-vb7kf" Nov 25 09:28:47 crc kubenswrapper[4565]: I1125 09:28:47.780157 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fafe6cf9-152c-4f70-871c-356528bad9a3-catalog-content\") pod \"community-operators-vb7kf\" (UID: \"fafe6cf9-152c-4f70-871c-356528bad9a3\") " pod="openshift-marketplace/community-operators-vb7kf" Nov 25 09:28:47 crc kubenswrapper[4565]: I1125 09:28:47.780790 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fafe6cf9-152c-4f70-871c-356528bad9a3-catalog-content\") pod \"community-operators-vb7kf\" (UID: 
\"fafe6cf9-152c-4f70-871c-356528bad9a3\") " pod="openshift-marketplace/community-operators-vb7kf" Nov 25 09:28:47 crc kubenswrapper[4565]: I1125 09:28:47.780833 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fafe6cf9-152c-4f70-871c-356528bad9a3-utilities\") pod \"community-operators-vb7kf\" (UID: \"fafe6cf9-152c-4f70-871c-356528bad9a3\") " pod="openshift-marketplace/community-operators-vb7kf" Nov 25 09:28:47 crc kubenswrapper[4565]: I1125 09:28:47.815220 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmx8s\" (UniqueName: \"kubernetes.io/projected/fafe6cf9-152c-4f70-871c-356528bad9a3-kube-api-access-tmx8s\") pod \"community-operators-vb7kf\" (UID: \"fafe6cf9-152c-4f70-871c-356528bad9a3\") " pod="openshift-marketplace/community-operators-vb7kf" Nov 25 09:28:47 crc kubenswrapper[4565]: I1125 09:28:47.998546 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vb7kf" Nov 25 09:28:48 crc kubenswrapper[4565]: I1125 09:28:48.511781 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vb7kf"] Nov 25 09:28:48 crc kubenswrapper[4565]: I1125 09:28:48.872796 4565 generic.go:334] "Generic (PLEG): container finished" podID="fafe6cf9-152c-4f70-871c-356528bad9a3" containerID="b8c07b40a7603a234fed89b1a11fbc6459a81658ae6fc41f7279e8bacc6efc61" exitCode=0 Nov 25 09:28:48 crc kubenswrapper[4565]: I1125 09:28:48.872851 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vb7kf" event={"ID":"fafe6cf9-152c-4f70-871c-356528bad9a3","Type":"ContainerDied","Data":"b8c07b40a7603a234fed89b1a11fbc6459a81658ae6fc41f7279e8bacc6efc61"} Nov 25 09:28:48 crc kubenswrapper[4565]: I1125 09:28:48.872882 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vb7kf" 
event={"ID":"fafe6cf9-152c-4f70-871c-356528bad9a3","Type":"ContainerStarted","Data":"0ce604bf58db9bad939c840ff21acb48c97a4913b58d84c8939654f670f03dd2"} Nov 25 09:28:49 crc kubenswrapper[4565]: I1125 09:28:49.882392 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vb7kf" event={"ID":"fafe6cf9-152c-4f70-871c-356528bad9a3","Type":"ContainerStarted","Data":"de77c682c3f625fa25cfe29ec490097e35d64038adf243bb14c4d24cf2d89c75"} Nov 25 09:28:50 crc kubenswrapper[4565]: E1125 09:28:50.626906 4565 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfafe6cf9_152c_4f70_871c_356528bad9a3.slice/crio-conmon-de77c682c3f625fa25cfe29ec490097e35d64038adf243bb14c4d24cf2d89c75.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfafe6cf9_152c_4f70_871c_356528bad9a3.slice/crio-de77c682c3f625fa25cfe29ec490097e35d64038adf243bb14c4d24cf2d89c75.scope\": RecentStats: unable to find data in memory cache]" Nov 25 09:28:50 crc kubenswrapper[4565]: I1125 09:28:50.892068 4565 generic.go:334] "Generic (PLEG): container finished" podID="fafe6cf9-152c-4f70-871c-356528bad9a3" containerID="de77c682c3f625fa25cfe29ec490097e35d64038adf243bb14c4d24cf2d89c75" exitCode=0 Nov 25 09:28:50 crc kubenswrapper[4565]: I1125 09:28:50.892199 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vb7kf" event={"ID":"fafe6cf9-152c-4f70-871c-356528bad9a3","Type":"ContainerDied","Data":"de77c682c3f625fa25cfe29ec490097e35d64038adf243bb14c4d24cf2d89c75"} Nov 25 09:28:51 crc kubenswrapper[4565]: I1125 09:28:51.904852 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vb7kf" 
event={"ID":"fafe6cf9-152c-4f70-871c-356528bad9a3","Type":"ContainerStarted","Data":"d9f0c6ba1b8c409252b72d0c3fbc56968e2c68259cfa09d7ab588d5fc15d529c"} Nov 25 09:28:51 crc kubenswrapper[4565]: I1125 09:28:51.924238 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vb7kf" podStartSLOduration=2.460512686 podStartE2EDuration="4.924214954s" podCreationTimestamp="2025-11-25 09:28:47 +0000 UTC" firstStartedPulling="2025-11-25 09:28:48.874822587 +0000 UTC m=+1462.077317725" lastFinishedPulling="2025-11-25 09:28:51.338524855 +0000 UTC m=+1464.541019993" observedRunningTime="2025-11-25 09:28:51.923011495 +0000 UTC m=+1465.125506632" watchObservedRunningTime="2025-11-25 09:28:51.924214954 +0000 UTC m=+1465.126710092" Nov 25 09:28:52 crc kubenswrapper[4565]: I1125 09:28:52.038313 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-fs8dc"] Nov 25 09:28:52 crc kubenswrapper[4565]: I1125 09:28:52.047179 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-fs8dc"] Nov 25 09:28:52 crc kubenswrapper[4565]: I1125 09:28:52.054344 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-4cj9f"] Nov 25 09:28:52 crc kubenswrapper[4565]: I1125 09:28:52.072168 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-4cj9f"] Nov 25 09:28:53 crc kubenswrapper[4565]: I1125 09:28:53.035720 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-c2zj4"] Nov 25 09:28:53 crc kubenswrapper[4565]: I1125 09:28:53.045990 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-c2zj4"] Nov 25 09:28:53 crc kubenswrapper[4565]: I1125 09:28:53.110779 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f29e809-94fa-4ff1-85d9-b1a6786a1763" path="/var/lib/kubelet/pods/0f29e809-94fa-4ff1-85d9-b1a6786a1763/volumes" Nov 25 
09:28:53 crc kubenswrapper[4565]: I1125 09:28:53.111354 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a355a7eb-1738-4c1b-975b-767821b77d5a" path="/var/lib/kubelet/pods/a355a7eb-1738-4c1b-975b-767821b77d5a/volumes"
Nov 25 09:28:53 crc kubenswrapper[4565]: I1125 09:28:53.111850 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d26517de-faae-4251-84d1-e9626a575d10" path="/var/lib/kubelet/pods/d26517de-faae-4251-84d1-e9626a575d10/volumes"
Nov 25 09:28:55 crc kubenswrapper[4565]: I1125 09:28:55.025851 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5dbb-account-create-gq67j"]
Nov 25 09:28:55 crc kubenswrapper[4565]: I1125 09:28:55.034746 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5dbb-account-create-gq67j"]
Nov 25 09:28:55 crc kubenswrapper[4565]: I1125 09:28:55.099436 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 09:28:55 crc kubenswrapper[4565]: I1125 09:28:55.099510 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 09:28:55 crc kubenswrapper[4565]: I1125 09:28:55.111272 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b86a550-d12f-4a27-abae-e115391cbb13" path="/var/lib/kubelet/pods/6b86a550-d12f-4a27-abae-e115391cbb13/volumes"
Nov 25 09:28:56 crc kubenswrapper[4565]: I1125 09:28:56.031661 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-0b9e-account-create-ztz2x"]
Nov 25 09:28:56 crc kubenswrapper[4565]: I1125 09:28:56.037436 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-19b0-account-create-g2zxc"]
Nov 25 09:28:56 crc kubenswrapper[4565]: I1125 09:28:56.042422 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-0b9e-account-create-ztz2x"]
Nov 25 09:28:56 crc kubenswrapper[4565]: I1125 09:28:56.048002 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-19b0-account-create-g2zxc"]
Nov 25 09:28:57 crc kubenswrapper[4565]: I1125 09:28:57.136683 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2005d542-3de1-4147-ad79-56126eccafdd" path="/var/lib/kubelet/pods/2005d542-3de1-4147-ad79-56126eccafdd/volumes"
Nov 25 09:28:57 crc kubenswrapper[4565]: I1125 09:28:57.137588 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc2d8baf-2d1d-463c-b340-04a7a920de12" path="/var/lib/kubelet/pods/dc2d8baf-2d1d-463c-b340-04a7a920de12/volumes"
Nov 25 09:28:57 crc kubenswrapper[4565]: I1125 09:28:57.999048 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vb7kf"
Nov 25 09:28:57 crc kubenswrapper[4565]: I1125 09:28:57.999381 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vb7kf"
Nov 25 09:28:58 crc kubenswrapper[4565]: I1125 09:28:58.038003 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vb7kf"
Nov 25 09:28:58 crc kubenswrapper[4565]: I1125 09:28:58.457613 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hvm2d"]
Nov 25 09:28:58 crc kubenswrapper[4565]: I1125 09:28:58.459556 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hvm2d"
Nov 25 09:28:58 crc kubenswrapper[4565]: I1125 09:28:58.471140 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hvm2d"]
Nov 25 09:28:58 crc kubenswrapper[4565]: I1125 09:28:58.544492 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4829cb82-3e31-4caf-803e-1b33caf907b5-catalog-content\") pod \"redhat-operators-hvm2d\" (UID: \"4829cb82-3e31-4caf-803e-1b33caf907b5\") " pod="openshift-marketplace/redhat-operators-hvm2d"
Nov 25 09:28:58 crc kubenswrapper[4565]: I1125 09:28:58.544545 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4829cb82-3e31-4caf-803e-1b33caf907b5-utilities\") pod \"redhat-operators-hvm2d\" (UID: \"4829cb82-3e31-4caf-803e-1b33caf907b5\") " pod="openshift-marketplace/redhat-operators-hvm2d"
Nov 25 09:28:58 crc kubenswrapper[4565]: I1125 09:28:58.544744 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqkz7\" (UniqueName: \"kubernetes.io/projected/4829cb82-3e31-4caf-803e-1b33caf907b5-kube-api-access-xqkz7\") pod \"redhat-operators-hvm2d\" (UID: \"4829cb82-3e31-4caf-803e-1b33caf907b5\") " pod="openshift-marketplace/redhat-operators-hvm2d"
Nov 25 09:28:58 crc kubenswrapper[4565]: I1125 09:28:58.646752 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4829cb82-3e31-4caf-803e-1b33caf907b5-catalog-content\") pod \"redhat-operators-hvm2d\" (UID: \"4829cb82-3e31-4caf-803e-1b33caf907b5\") " pod="openshift-marketplace/redhat-operators-hvm2d"
Nov 25 09:28:58 crc kubenswrapper[4565]: I1125 09:28:58.646797 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4829cb82-3e31-4caf-803e-1b33caf907b5-utilities\") pod \"redhat-operators-hvm2d\" (UID: \"4829cb82-3e31-4caf-803e-1b33caf907b5\") " pod="openshift-marketplace/redhat-operators-hvm2d"
Nov 25 09:28:58 crc kubenswrapper[4565]: I1125 09:28:58.646844 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqkz7\" (UniqueName: \"kubernetes.io/projected/4829cb82-3e31-4caf-803e-1b33caf907b5-kube-api-access-xqkz7\") pod \"redhat-operators-hvm2d\" (UID: \"4829cb82-3e31-4caf-803e-1b33caf907b5\") " pod="openshift-marketplace/redhat-operators-hvm2d"
Nov 25 09:28:58 crc kubenswrapper[4565]: I1125 09:28:58.647411 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4829cb82-3e31-4caf-803e-1b33caf907b5-catalog-content\") pod \"redhat-operators-hvm2d\" (UID: \"4829cb82-3e31-4caf-803e-1b33caf907b5\") " pod="openshift-marketplace/redhat-operators-hvm2d"
Nov 25 09:28:58 crc kubenswrapper[4565]: I1125 09:28:58.647477 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4829cb82-3e31-4caf-803e-1b33caf907b5-utilities\") pod \"redhat-operators-hvm2d\" (UID: \"4829cb82-3e31-4caf-803e-1b33caf907b5\") " pod="openshift-marketplace/redhat-operators-hvm2d"
Nov 25 09:28:58 crc kubenswrapper[4565]: I1125 09:28:58.664288 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqkz7\" (UniqueName: \"kubernetes.io/projected/4829cb82-3e31-4caf-803e-1b33caf907b5-kube-api-access-xqkz7\") pod \"redhat-operators-hvm2d\" (UID: \"4829cb82-3e31-4caf-803e-1b33caf907b5\") " pod="openshift-marketplace/redhat-operators-hvm2d"
Nov 25 09:28:58 crc kubenswrapper[4565]: I1125 09:28:58.779294 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hvm2d"
Nov 25 09:28:59 crc kubenswrapper[4565]: I1125 09:28:59.040760 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-bssz7"]
Nov 25 09:28:59 crc kubenswrapper[4565]: I1125 09:28:59.063749 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-bssz7"]
Nov 25 09:28:59 crc kubenswrapper[4565]: I1125 09:28:59.081784 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vb7kf"
Nov 25 09:28:59 crc kubenswrapper[4565]: I1125 09:28:59.105584 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2df05272-b4de-41ac-8d18-16c29398c0d4" path="/var/lib/kubelet/pods/2df05272-b4de-41ac-8d18-16c29398c0d4/volumes"
Nov 25 09:28:59 crc kubenswrapper[4565]: I1125 09:28:59.312398 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hvm2d"]
Nov 25 09:29:00 crc kubenswrapper[4565]: I1125 09:29:00.003595 4565 generic.go:334] "Generic (PLEG): container finished" podID="4829cb82-3e31-4caf-803e-1b33caf907b5" containerID="4ed6fe80aad7a14196c486538cca0f8701ca877df1e30e9a4012c0a27488b6d9" exitCode=0
Nov 25 09:29:00 crc kubenswrapper[4565]: I1125 09:29:00.005231 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvm2d" event={"ID":"4829cb82-3e31-4caf-803e-1b33caf907b5","Type":"ContainerDied","Data":"4ed6fe80aad7a14196c486538cca0f8701ca877df1e30e9a4012c0a27488b6d9"}
Nov 25 09:29:00 crc kubenswrapper[4565]: I1125 09:29:00.005328 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvm2d" event={"ID":"4829cb82-3e31-4caf-803e-1b33caf907b5","Type":"ContainerStarted","Data":"8528998afe2a69316da09fdd4325b9f51ace753f5d34d9db51bb65455479b64d"}
Nov 25 09:29:01 crc kubenswrapper[4565]: I1125 09:29:01.462880 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vb7kf"]
Nov 25 09:29:01 crc kubenswrapper[4565]: I1125 09:29:01.463734 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vb7kf" podUID="fafe6cf9-152c-4f70-871c-356528bad9a3" containerName="registry-server" containerID="cri-o://d9f0c6ba1b8c409252b72d0c3fbc56968e2c68259cfa09d7ab588d5fc15d529c" gracePeriod=2
Nov 25 09:29:01 crc kubenswrapper[4565]: I1125 09:29:01.879480 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vb7kf"
Nov 25 09:29:01 crc kubenswrapper[4565]: I1125 09:29:01.924194 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fafe6cf9-152c-4f70-871c-356528bad9a3-utilities\") pod \"fafe6cf9-152c-4f70-871c-356528bad9a3\" (UID: \"fafe6cf9-152c-4f70-871c-356528bad9a3\") "
Nov 25 09:29:01 crc kubenswrapper[4565]: I1125 09:29:01.924263 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmx8s\" (UniqueName: \"kubernetes.io/projected/fafe6cf9-152c-4f70-871c-356528bad9a3-kube-api-access-tmx8s\") pod \"fafe6cf9-152c-4f70-871c-356528bad9a3\" (UID: \"fafe6cf9-152c-4f70-871c-356528bad9a3\") "
Nov 25 09:29:01 crc kubenswrapper[4565]: I1125 09:29:01.924354 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fafe6cf9-152c-4f70-871c-356528bad9a3-catalog-content\") pod \"fafe6cf9-152c-4f70-871c-356528bad9a3\" (UID: \"fafe6cf9-152c-4f70-871c-356528bad9a3\") "
Nov 25 09:29:01 crc kubenswrapper[4565]: I1125 09:29:01.925134 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fafe6cf9-152c-4f70-871c-356528bad9a3-utilities" (OuterVolumeSpecName: "utilities") pod "fafe6cf9-152c-4f70-871c-356528bad9a3" (UID: "fafe6cf9-152c-4f70-871c-356528bad9a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 09:29:01 crc kubenswrapper[4565]: I1125 09:29:01.936302 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fafe6cf9-152c-4f70-871c-356528bad9a3-kube-api-access-tmx8s" (OuterVolumeSpecName: "kube-api-access-tmx8s") pod "fafe6cf9-152c-4f70-871c-356528bad9a3" (UID: "fafe6cf9-152c-4f70-871c-356528bad9a3"). InnerVolumeSpecName "kube-api-access-tmx8s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:29:02 crc kubenswrapper[4565]: I1125 09:29:02.003619 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fafe6cf9-152c-4f70-871c-356528bad9a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fafe6cf9-152c-4f70-871c-356528bad9a3" (UID: "fafe6cf9-152c-4f70-871c-356528bad9a3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 09:29:02 crc kubenswrapper[4565]: I1125 09:29:02.026383 4565 generic.go:334] "Generic (PLEG): container finished" podID="fafe6cf9-152c-4f70-871c-356528bad9a3" containerID="d9f0c6ba1b8c409252b72d0c3fbc56968e2c68259cfa09d7ab588d5fc15d529c" exitCode=0
Nov 25 09:29:02 crc kubenswrapper[4565]: I1125 09:29:02.026455 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vb7kf" event={"ID":"fafe6cf9-152c-4f70-871c-356528bad9a3","Type":"ContainerDied","Data":"d9f0c6ba1b8c409252b72d0c3fbc56968e2c68259cfa09d7ab588d5fc15d529c"}
Nov 25 09:29:02 crc kubenswrapper[4565]: I1125 09:29:02.026470 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vb7kf"
Nov 25 09:29:02 crc kubenswrapper[4565]: I1125 09:29:02.026497 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vb7kf" event={"ID":"fafe6cf9-152c-4f70-871c-356528bad9a3","Type":"ContainerDied","Data":"0ce604bf58db9bad939c840ff21acb48c97a4913b58d84c8939654f670f03dd2"}
Nov 25 09:29:02 crc kubenswrapper[4565]: I1125 09:29:02.026541 4565 scope.go:117] "RemoveContainer" containerID="d9f0c6ba1b8c409252b72d0c3fbc56968e2c68259cfa09d7ab588d5fc15d529c"
Nov 25 09:29:02 crc kubenswrapper[4565]: I1125 09:29:02.027746 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fafe6cf9-152c-4f70-871c-356528bad9a3-utilities\") on node \"crc\" DevicePath \"\""
Nov 25 09:29:02 crc kubenswrapper[4565]: I1125 09:29:02.027761 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmx8s\" (UniqueName: \"kubernetes.io/projected/fafe6cf9-152c-4f70-871c-356528bad9a3-kube-api-access-tmx8s\") on node \"crc\" DevicePath \"\""
Nov 25 09:29:02 crc kubenswrapper[4565]: I1125 09:29:02.027774 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fafe6cf9-152c-4f70-871c-356528bad9a3-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 25 09:29:02 crc kubenswrapper[4565]: I1125 09:29:02.028622 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvm2d" event={"ID":"4829cb82-3e31-4caf-803e-1b33caf907b5","Type":"ContainerStarted","Data":"31c474878e142acec25a69e433e7a625b8bc0a57998585a4a84b05f5af826e6f"}
Nov 25 09:29:02 crc kubenswrapper[4565]: I1125 09:29:02.044605 4565 scope.go:117] "RemoveContainer" containerID="de77c682c3f625fa25cfe29ec490097e35d64038adf243bb14c4d24cf2d89c75"
Nov 25 09:29:02 crc kubenswrapper[4565]: I1125 09:29:02.068518 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vb7kf"]
Nov 25 09:29:02 crc kubenswrapper[4565]: I1125 09:29:02.074074 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vb7kf"]
Nov 25 09:29:02 crc kubenswrapper[4565]: I1125 09:29:02.078284 4565 scope.go:117] "RemoveContainer" containerID="b8c07b40a7603a234fed89b1a11fbc6459a81658ae6fc41f7279e8bacc6efc61"
Nov 25 09:29:02 crc kubenswrapper[4565]: I1125 09:29:02.106471 4565 scope.go:117] "RemoveContainer" containerID="d9f0c6ba1b8c409252b72d0c3fbc56968e2c68259cfa09d7ab588d5fc15d529c"
Nov 25 09:29:02 crc kubenswrapper[4565]: E1125 09:29:02.107259 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9f0c6ba1b8c409252b72d0c3fbc56968e2c68259cfa09d7ab588d5fc15d529c\": container with ID starting with d9f0c6ba1b8c409252b72d0c3fbc56968e2c68259cfa09d7ab588d5fc15d529c not found: ID does not exist" containerID="d9f0c6ba1b8c409252b72d0c3fbc56968e2c68259cfa09d7ab588d5fc15d529c"
Nov 25 09:29:02 crc kubenswrapper[4565]: I1125 09:29:02.107293 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9f0c6ba1b8c409252b72d0c3fbc56968e2c68259cfa09d7ab588d5fc15d529c"} err="failed to get container status \"d9f0c6ba1b8c409252b72d0c3fbc56968e2c68259cfa09d7ab588d5fc15d529c\": rpc error: code = NotFound desc = could not find container \"d9f0c6ba1b8c409252b72d0c3fbc56968e2c68259cfa09d7ab588d5fc15d529c\": container with ID starting with d9f0c6ba1b8c409252b72d0c3fbc56968e2c68259cfa09d7ab588d5fc15d529c not found: ID does not exist"
Nov 25 09:29:02 crc kubenswrapper[4565]: I1125 09:29:02.107320 4565 scope.go:117] "RemoveContainer" containerID="de77c682c3f625fa25cfe29ec490097e35d64038adf243bb14c4d24cf2d89c75"
Nov 25 09:29:02 crc kubenswrapper[4565]: E1125 09:29:02.107881 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de77c682c3f625fa25cfe29ec490097e35d64038adf243bb14c4d24cf2d89c75\": container with ID starting with de77c682c3f625fa25cfe29ec490097e35d64038adf243bb14c4d24cf2d89c75 not found: ID does not exist" containerID="de77c682c3f625fa25cfe29ec490097e35d64038adf243bb14c4d24cf2d89c75"
Nov 25 09:29:02 crc kubenswrapper[4565]: I1125 09:29:02.107972 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de77c682c3f625fa25cfe29ec490097e35d64038adf243bb14c4d24cf2d89c75"} err="failed to get container status \"de77c682c3f625fa25cfe29ec490097e35d64038adf243bb14c4d24cf2d89c75\": rpc error: code = NotFound desc = could not find container \"de77c682c3f625fa25cfe29ec490097e35d64038adf243bb14c4d24cf2d89c75\": container with ID starting with de77c682c3f625fa25cfe29ec490097e35d64038adf243bb14c4d24cf2d89c75 not found: ID does not exist"
Nov 25 09:29:02 crc kubenswrapper[4565]: I1125 09:29:02.108018 4565 scope.go:117] "RemoveContainer" containerID="b8c07b40a7603a234fed89b1a11fbc6459a81658ae6fc41f7279e8bacc6efc61"
Nov 25 09:29:02 crc kubenswrapper[4565]: E1125 09:29:02.108446 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8c07b40a7603a234fed89b1a11fbc6459a81658ae6fc41f7279e8bacc6efc61\": container with ID starting with b8c07b40a7603a234fed89b1a11fbc6459a81658ae6fc41f7279e8bacc6efc61 not found: ID does not exist" containerID="b8c07b40a7603a234fed89b1a11fbc6459a81658ae6fc41f7279e8bacc6efc61"
Nov 25 09:29:02 crc kubenswrapper[4565]: I1125 09:29:02.108475 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8c07b40a7603a234fed89b1a11fbc6459a81658ae6fc41f7279e8bacc6efc61"} err="failed to get container status \"b8c07b40a7603a234fed89b1a11fbc6459a81658ae6fc41f7279e8bacc6efc61\": rpc error: code = NotFound desc = could not find container \"b8c07b40a7603a234fed89b1a11fbc6459a81658ae6fc41f7279e8bacc6efc61\": container with ID starting with b8c07b40a7603a234fed89b1a11fbc6459a81658ae6fc41f7279e8bacc6efc61 not found: ID does not exist"
Nov 25 09:29:03 crc kubenswrapper[4565]: I1125 09:29:03.117995 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fafe6cf9-152c-4f70-871c-356528bad9a3" path="/var/lib/kubelet/pods/fafe6cf9-152c-4f70-871c-356528bad9a3/volumes"
Nov 25 09:29:04 crc kubenswrapper[4565]: I1125 09:29:04.049199 4565 generic.go:334] "Generic (PLEG): container finished" podID="4829cb82-3e31-4caf-803e-1b33caf907b5" containerID="31c474878e142acec25a69e433e7a625b8bc0a57998585a4a84b05f5af826e6f" exitCode=0
Nov 25 09:29:04 crc kubenswrapper[4565]: I1125 09:29:04.049263 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvm2d" event={"ID":"4829cb82-3e31-4caf-803e-1b33caf907b5","Type":"ContainerDied","Data":"31c474878e142acec25a69e433e7a625b8bc0a57998585a4a84b05f5af826e6f"}
Nov 25 09:29:05 crc kubenswrapper[4565]: I1125 09:29:05.062952 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvm2d" event={"ID":"4829cb82-3e31-4caf-803e-1b33caf907b5","Type":"ContainerStarted","Data":"e67e839bffd81e23eb392f4a1fb9435cec6333c8d36b6c06ee5e949c0f49d9e5"}
Nov 25 09:29:05 crc kubenswrapper[4565]: I1125 09:29:05.089347 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hvm2d" podStartSLOduration=2.547658884 podStartE2EDuration="7.089331347s" podCreationTimestamp="2025-11-25 09:28:58 +0000 UTC" firstStartedPulling="2025-11-25 09:29:00.00775219 +0000 UTC m=+1473.210247328" lastFinishedPulling="2025-11-25 09:29:04.549424653 +0000 UTC m=+1477.751919791" observedRunningTime="2025-11-25 09:29:05.084978591 +0000 UTC m=+1478.287473729" watchObservedRunningTime="2025-11-25 09:29:05.089331347 +0000 UTC m=+1478.291826484"
Nov 25 09:29:07 crc kubenswrapper[4565]: I1125 09:29:07.084499 4565 generic.go:334] "Generic (PLEG): container finished" podID="8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe" containerID="3a7b58c76992e9ac43411a2c966c07e745cd234be0fa4cd96d543e695fdd3a64" exitCode=0
Nov 25 09:29:07 crc kubenswrapper[4565]: I1125 09:29:07.084598 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-689ff" event={"ID":"8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe","Type":"ContainerDied","Data":"3a7b58c76992e9ac43411a2c966c07e745cd234be0fa4cd96d543e695fdd3a64"}
Nov 25 09:29:08 crc kubenswrapper[4565]: I1125 09:29:08.450890 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-689ff"
Nov 25 09:29:08 crc kubenswrapper[4565]: I1125 09:29:08.575136 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmdvq\" (UniqueName: \"kubernetes.io/projected/8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe-kube-api-access-gmdvq\") pod \"8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe\" (UID: \"8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe\") "
Nov 25 09:29:08 crc kubenswrapper[4565]: I1125 09:29:08.575207 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe-ssh-key\") pod \"8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe\" (UID: \"8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe\") "
Nov 25 09:29:08 crc kubenswrapper[4565]: I1125 09:29:08.575912 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe-inventory\") pod \"8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe\" (UID: \"8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe\") "
Nov 25 09:29:08 crc kubenswrapper[4565]: I1125 09:29:08.583970 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe-kube-api-access-gmdvq" (OuterVolumeSpecName: "kube-api-access-gmdvq") pod "8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe" (UID: "8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe"). InnerVolumeSpecName "kube-api-access-gmdvq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:29:08 crc kubenswrapper[4565]: I1125 09:29:08.606385 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe-inventory" (OuterVolumeSpecName: "inventory") pod "8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe" (UID: "8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:29:08 crc kubenswrapper[4565]: I1125 09:29:08.611640 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe" (UID: "8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:29:08 crc kubenswrapper[4565]: I1125 09:29:08.683265 4565 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe-inventory\") on node \"crc\" DevicePath \"\""
Nov 25 09:29:08 crc kubenswrapper[4565]: I1125 09:29:08.684034 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmdvq\" (UniqueName: \"kubernetes.io/projected/8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe-kube-api-access-gmdvq\") on node \"crc\" DevicePath \"\""
Nov 25 09:29:08 crc kubenswrapper[4565]: I1125 09:29:08.684063 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 25 09:29:08 crc kubenswrapper[4565]: I1125 09:29:08.779712 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hvm2d"
Nov 25 09:29:08 crc kubenswrapper[4565]: I1125 09:29:08.779792 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hvm2d"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.107869 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-689ff"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.108727 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-689ff" event={"ID":"8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe","Type":"ContainerDied","Data":"29d967d6d3de4cba80be458e5a7974a88530e8590920a3dae9c64477a00e6617"}
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.108791 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29d967d6d3de4cba80be458e5a7974a88530e8590920a3dae9c64477a00e6617"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.182943 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5"]
Nov 25 09:29:09 crc kubenswrapper[4565]: E1125 09:29:09.183361 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fafe6cf9-152c-4f70-871c-356528bad9a3" containerName="extract-content"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.183382 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="fafe6cf9-152c-4f70-871c-356528bad9a3" containerName="extract-content"
Nov 25 09:29:09 crc kubenswrapper[4565]: E1125 09:29:09.183396 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fafe6cf9-152c-4f70-871c-356528bad9a3" containerName="registry-server"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.183402 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="fafe6cf9-152c-4f70-871c-356528bad9a3" containerName="registry-server"
Nov 25 09:29:09 crc kubenswrapper[4565]: E1125 09:29:09.183421 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fafe6cf9-152c-4f70-871c-356528bad9a3" containerName="extract-utilities"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.183427 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="fafe6cf9-152c-4f70-871c-356528bad9a3" containerName="extract-utilities"
Nov 25 09:29:09 crc kubenswrapper[4565]: E1125 09:29:09.183452 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.183460 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.183696 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.183733 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="fafe6cf9-152c-4f70-871c-356528bad9a3" containerName="registry-server"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.184396 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.186064 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.186266 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.186520 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.186673 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.197275 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5"]
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.201640 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q2nl\" (UniqueName: \"kubernetes.io/projected/70ac68ea-c3a1-427c-b07a-1e1e5bc6e466-kube-api-access-2q2nl\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5\" (UID: \"70ac68ea-c3a1-427c-b07a-1e1e5bc6e466\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.201840 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70ac68ea-c3a1-427c-b07a-1e1e5bc6e466-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5\" (UID: \"70ac68ea-c3a1-427c-b07a-1e1e5bc6e466\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.201919 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70ac68ea-c3a1-427c-b07a-1e1e5bc6e466-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5\" (UID: \"70ac68ea-c3a1-427c-b07a-1e1e5bc6e466\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.304488 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q2nl\" (UniqueName: \"kubernetes.io/projected/70ac68ea-c3a1-427c-b07a-1e1e5bc6e466-kube-api-access-2q2nl\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5\" (UID: \"70ac68ea-c3a1-427c-b07a-1e1e5bc6e466\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.304723 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70ac68ea-c3a1-427c-b07a-1e1e5bc6e466-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5\" (UID: \"70ac68ea-c3a1-427c-b07a-1e1e5bc6e466\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.304816 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70ac68ea-c3a1-427c-b07a-1e1e5bc6e466-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5\" (UID: \"70ac68ea-c3a1-427c-b07a-1e1e5bc6e466\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.310225 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70ac68ea-c3a1-427c-b07a-1e1e5bc6e466-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5\" (UID: \"70ac68ea-c3a1-427c-b07a-1e1e5bc6e466\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.310347 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70ac68ea-c3a1-427c-b07a-1e1e5bc6e466-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5\" (UID: \"70ac68ea-c3a1-427c-b07a-1e1e5bc6e466\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.322622 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q2nl\" (UniqueName: \"kubernetes.io/projected/70ac68ea-c3a1-427c-b07a-1e1e5bc6e466-kube-api-access-2q2nl\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5\" (UID: \"70ac68ea-c3a1-427c-b07a-1e1e5bc6e466\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.499734 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5"
Nov 25 09:29:09 crc kubenswrapper[4565]: I1125 09:29:09.828339 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hvm2d" podUID="4829cb82-3e31-4caf-803e-1b33caf907b5" containerName="registry-server" probeResult="failure" output=<
Nov 25 09:29:09 crc kubenswrapper[4565]: timeout: failed to connect service ":50051" within 1s
Nov 25 09:29:09 crc kubenswrapper[4565]: >
Nov 25 09:29:10 crc kubenswrapper[4565]: I1125 09:29:10.008547 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5"]
Nov 25 09:29:10 crc kubenswrapper[4565]: I1125 09:29:10.117316 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5" event={"ID":"70ac68ea-c3a1-427c-b07a-1e1e5bc6e466","Type":"ContainerStarted","Data":"a3bc5c2513edcdc92d381932376ded6d5c95aee6b4276cb4fe6972c8a27be6b6"}
Nov 25 09:29:12 crc kubenswrapper[4565]: I1125 09:29:12.151954 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5" event={"ID":"70ac68ea-c3a1-427c-b07a-1e1e5bc6e466","Type":"ContainerStarted","Data":"51a83edc25335af15bccc91e7ef4e9040c9264d323d232d9be0d7c6b02429ded"}
Nov 25 09:29:12 crc kubenswrapper[4565]: I1125 09:29:12.170698 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5" podStartSLOduration=2.214708322 podStartE2EDuration="3.170678714s" podCreationTimestamp="2025-11-25 09:29:09 +0000 UTC" firstStartedPulling="2025-11-25 09:29:10.022828931 +0000 UTC m=+1483.225324069" lastFinishedPulling="2025-11-25 09:29:10.978799324 +0000 UTC m=+1484.181294461" observedRunningTime="2025-11-25 09:29:12.168745511 +0000 UTC m=+1485.371240649" watchObservedRunningTime="2025-11-25 09:29:12.170678714 +0000 UTC m=+1485.373173853"
Nov 25 09:29:14 crc kubenswrapper[4565]: I1125 09:29:14.027777 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-66vnw"]
Nov 25 09:29:14 crc kubenswrapper[4565]: I1125 09:29:14.035266 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-66vnw"]
Nov 25 09:29:15 crc kubenswrapper[4565]: I1125 09:29:15.109634 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b855517-9e82-425c-ab21-321b63053540" path="/var/lib/kubelet/pods/7b855517-9e82-425c-ab21-321b63053540/volumes"
Nov 25 09:29:15 crc kubenswrapper[4565]: I1125 09:29:15.179560 4565 generic.go:334] "Generic (PLEG): container finished" podID="70ac68ea-c3a1-427c-b07a-1e1e5bc6e466" containerID="51a83edc25335af15bccc91e7ef4e9040c9264d323d232d9be0d7c6b02429ded" exitCode=0
Nov 25 09:29:15 crc kubenswrapper[4565]: I1125 09:29:15.179624 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5" event={"ID":"70ac68ea-c3a1-427c-b07a-1e1e5bc6e466","Type":"ContainerDied","Data":"51a83edc25335af15bccc91e7ef4e9040c9264d323d232d9be0d7c6b02429ded"}
Nov 25 09:29:16 crc kubenswrapper[4565]: I1125 09:29:16.558604 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5"
Nov 25 09:29:16 crc kubenswrapper[4565]: I1125 09:29:16.667359 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70ac68ea-c3a1-427c-b07a-1e1e5bc6e466-inventory\") pod \"70ac68ea-c3a1-427c-b07a-1e1e5bc6e466\" (UID: \"70ac68ea-c3a1-427c-b07a-1e1e5bc6e466\") "
Nov 25 09:29:16 crc kubenswrapper[4565]: I1125 09:29:16.667667 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q2nl\" (UniqueName: \"kubernetes.io/projected/70ac68ea-c3a1-427c-b07a-1e1e5bc6e466-kube-api-access-2q2nl\") pod \"70ac68ea-c3a1-427c-b07a-1e1e5bc6e466\" (UID: \"70ac68ea-c3a1-427c-b07a-1e1e5bc6e466\") "
Nov 25 09:29:16 crc kubenswrapper[4565]: I1125 09:29:16.667857 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70ac68ea-c3a1-427c-b07a-1e1e5bc6e466-ssh-key\") pod \"70ac68ea-c3a1-427c-b07a-1e1e5bc6e466\" (UID: \"70ac68ea-c3a1-427c-b07a-1e1e5bc6e466\") "
Nov 25 09:29:16 crc kubenswrapper[4565]: I1125 09:29:16.673797 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70ac68ea-c3a1-427c-b07a-1e1e5bc6e466-kube-api-access-2q2nl" (OuterVolumeSpecName: "kube-api-access-2q2nl") pod "70ac68ea-c3a1-427c-b07a-1e1e5bc6e466" (UID: "70ac68ea-c3a1-427c-b07a-1e1e5bc6e466"). InnerVolumeSpecName "kube-api-access-2q2nl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:29:16 crc kubenswrapper[4565]: I1125 09:29:16.696736 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70ac68ea-c3a1-427c-b07a-1e1e5bc6e466-inventory" (OuterVolumeSpecName: "inventory") pod "70ac68ea-c3a1-427c-b07a-1e1e5bc6e466" (UID: "70ac68ea-c3a1-427c-b07a-1e1e5bc6e466"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:29:16 crc kubenswrapper[4565]: I1125 09:29:16.697370 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70ac68ea-c3a1-427c-b07a-1e1e5bc6e466-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "70ac68ea-c3a1-427c-b07a-1e1e5bc6e466" (UID: "70ac68ea-c3a1-427c-b07a-1e1e5bc6e466"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:29:16 crc kubenswrapper[4565]: I1125 09:29:16.770656 4565 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70ac68ea-c3a1-427c-b07a-1e1e5bc6e466-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 09:29:16 crc kubenswrapper[4565]: I1125 09:29:16.770693 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q2nl\" (UniqueName: \"kubernetes.io/projected/70ac68ea-c3a1-427c-b07a-1e1e5bc6e466-kube-api-access-2q2nl\") on node \"crc\" DevicePath \"\"" Nov 25 09:29:16 crc kubenswrapper[4565]: I1125 09:29:16.770707 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70ac68ea-c3a1-427c-b07a-1e1e5bc6e466-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 09:29:17 crc kubenswrapper[4565]: I1125 09:29:17.199366 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5" event={"ID":"70ac68ea-c3a1-427c-b07a-1e1e5bc6e466","Type":"ContainerDied","Data":"a3bc5c2513edcdc92d381932376ded6d5c95aee6b4276cb4fe6972c8a27be6b6"} Nov 25 09:29:17 crc kubenswrapper[4565]: I1125 09:29:17.199584 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3bc5c2513edcdc92d381932376ded6d5c95aee6b4276cb4fe6972c8a27be6b6" Nov 25 09:29:17 crc kubenswrapper[4565]: I1125 09:29:17.199441 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5" Nov 25 09:29:17 crc kubenswrapper[4565]: I1125 09:29:17.324215 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sfksn"] Nov 25 09:29:17 crc kubenswrapper[4565]: E1125 09:29:17.325336 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70ac68ea-c3a1-427c-b07a-1e1e5bc6e466" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Nov 25 09:29:17 crc kubenswrapper[4565]: I1125 09:29:17.325367 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="70ac68ea-c3a1-427c-b07a-1e1e5bc6e466" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Nov 25 09:29:17 crc kubenswrapper[4565]: I1125 09:29:17.325649 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="70ac68ea-c3a1-427c-b07a-1e1e5bc6e466" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Nov 25 09:29:17 crc kubenswrapper[4565]: I1125 09:29:17.326519 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sfksn" Nov 25 09:29:17 crc kubenswrapper[4565]: I1125 09:29:17.332619 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc" Nov 25 09:29:17 crc kubenswrapper[4565]: I1125 09:29:17.332883 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 09:29:17 crc kubenswrapper[4565]: I1125 09:29:17.333142 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 09:29:17 crc kubenswrapper[4565]: I1125 09:29:17.333355 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 09:29:17 crc kubenswrapper[4565]: I1125 09:29:17.341729 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sfksn"] Nov 25 09:29:17 crc kubenswrapper[4565]: I1125 09:29:17.383135 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dp7v\" (UniqueName: \"kubernetes.io/projected/d461ae1d-1d53-43af-830d-a4e301627bf9-kube-api-access-5dp7v\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sfksn\" (UID: \"d461ae1d-1d53-43af-830d-a4e301627bf9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sfksn" Nov 25 09:29:17 crc kubenswrapper[4565]: I1125 09:29:17.383229 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d461ae1d-1d53-43af-830d-a4e301627bf9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sfksn\" (UID: \"d461ae1d-1d53-43af-830d-a4e301627bf9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sfksn" Nov 25 09:29:17 crc kubenswrapper[4565]: I1125 09:29:17.383321 4565 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d461ae1d-1d53-43af-830d-a4e301627bf9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sfksn\" (UID: \"d461ae1d-1d53-43af-830d-a4e301627bf9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sfksn" Nov 25 09:29:17 crc kubenswrapper[4565]: I1125 09:29:17.484523 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d461ae1d-1d53-43af-830d-a4e301627bf9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sfksn\" (UID: \"d461ae1d-1d53-43af-830d-a4e301627bf9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sfksn" Nov 25 09:29:17 crc kubenswrapper[4565]: I1125 09:29:17.484630 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dp7v\" (UniqueName: \"kubernetes.io/projected/d461ae1d-1d53-43af-830d-a4e301627bf9-kube-api-access-5dp7v\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sfksn\" (UID: \"d461ae1d-1d53-43af-830d-a4e301627bf9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sfksn" Nov 25 09:29:17 crc kubenswrapper[4565]: I1125 09:29:17.484708 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d461ae1d-1d53-43af-830d-a4e301627bf9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sfksn\" (UID: \"d461ae1d-1d53-43af-830d-a4e301627bf9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sfksn" Nov 25 09:29:17 crc kubenswrapper[4565]: I1125 09:29:17.494004 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d461ae1d-1d53-43af-830d-a4e301627bf9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sfksn\" (UID: 
\"d461ae1d-1d53-43af-830d-a4e301627bf9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sfksn" Nov 25 09:29:17 crc kubenswrapper[4565]: I1125 09:29:17.495331 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d461ae1d-1d53-43af-830d-a4e301627bf9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sfksn\" (UID: \"d461ae1d-1d53-43af-830d-a4e301627bf9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sfksn" Nov 25 09:29:17 crc kubenswrapper[4565]: I1125 09:29:17.507230 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dp7v\" (UniqueName: \"kubernetes.io/projected/d461ae1d-1d53-43af-830d-a4e301627bf9-kube-api-access-5dp7v\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sfksn\" (UID: \"d461ae1d-1d53-43af-830d-a4e301627bf9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sfksn" Nov 25 09:29:17 crc kubenswrapper[4565]: I1125 09:29:17.656693 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sfksn" Nov 25 09:29:18 crc kubenswrapper[4565]: I1125 09:29:18.636322 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sfksn"] Nov 25 09:29:18 crc kubenswrapper[4565]: I1125 09:29:18.822218 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hvm2d" Nov 25 09:29:18 crc kubenswrapper[4565]: I1125 09:29:18.868518 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hvm2d" Nov 25 09:29:19 crc kubenswrapper[4565]: I1125 09:29:19.065136 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hvm2d"] Nov 25 09:29:19 crc kubenswrapper[4565]: I1125 09:29:19.224840 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sfksn" event={"ID":"d461ae1d-1d53-43af-830d-a4e301627bf9","Type":"ContainerStarted","Data":"1d5c1cf2a4d024dc67b5919c20ab9799212b0a815962321f12d4414715b1cc48"} Nov 25 09:29:20 crc kubenswrapper[4565]: I1125 09:29:20.236148 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hvm2d" podUID="4829cb82-3e31-4caf-803e-1b33caf907b5" containerName="registry-server" containerID="cri-o://e67e839bffd81e23eb392f4a1fb9435cec6333c8d36b6c06ee5e949c0f49d9e5" gracePeriod=2 Nov 25 09:29:20 crc kubenswrapper[4565]: I1125 09:29:20.236920 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sfksn" event={"ID":"d461ae1d-1d53-43af-830d-a4e301627bf9","Type":"ContainerStarted","Data":"50c07b19060375a1c048d913967518c2e8da6a677e127af3bd7b4138228d1660"} Nov 25 09:29:20 crc kubenswrapper[4565]: I1125 09:29:20.269454 4565 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sfksn" podStartSLOduration=2.774744373 podStartE2EDuration="3.269432248s" podCreationTimestamp="2025-11-25 09:29:17 +0000 UTC" firstStartedPulling="2025-11-25 09:29:18.645516937 +0000 UTC m=+1491.848012075" lastFinishedPulling="2025-11-25 09:29:19.140204812 +0000 UTC m=+1492.342699950" observedRunningTime="2025-11-25 09:29:20.263237711 +0000 UTC m=+1493.465732849" watchObservedRunningTime="2025-11-25 09:29:20.269432248 +0000 UTC m=+1493.471927387" Nov 25 09:29:20 crc kubenswrapper[4565]: I1125 09:29:20.641943 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hvm2d" Nov 25 09:29:20 crc kubenswrapper[4565]: I1125 09:29:20.781266 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqkz7\" (UniqueName: \"kubernetes.io/projected/4829cb82-3e31-4caf-803e-1b33caf907b5-kube-api-access-xqkz7\") pod \"4829cb82-3e31-4caf-803e-1b33caf907b5\" (UID: \"4829cb82-3e31-4caf-803e-1b33caf907b5\") " Nov 25 09:29:20 crc kubenswrapper[4565]: I1125 09:29:20.782086 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4829cb82-3e31-4caf-803e-1b33caf907b5-catalog-content\") pod \"4829cb82-3e31-4caf-803e-1b33caf907b5\" (UID: \"4829cb82-3e31-4caf-803e-1b33caf907b5\") " Nov 25 09:29:20 crc kubenswrapper[4565]: I1125 09:29:20.782346 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4829cb82-3e31-4caf-803e-1b33caf907b5-utilities\") pod \"4829cb82-3e31-4caf-803e-1b33caf907b5\" (UID: \"4829cb82-3e31-4caf-803e-1b33caf907b5\") " Nov 25 09:29:20 crc kubenswrapper[4565]: I1125 09:29:20.783157 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/4829cb82-3e31-4caf-803e-1b33caf907b5-utilities" (OuterVolumeSpecName: "utilities") pod "4829cb82-3e31-4caf-803e-1b33caf907b5" (UID: "4829cb82-3e31-4caf-803e-1b33caf907b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:29:20 crc kubenswrapper[4565]: I1125 09:29:20.783669 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4829cb82-3e31-4caf-803e-1b33caf907b5-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:29:20 crc kubenswrapper[4565]: I1125 09:29:20.794099 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4829cb82-3e31-4caf-803e-1b33caf907b5-kube-api-access-xqkz7" (OuterVolumeSpecName: "kube-api-access-xqkz7") pod "4829cb82-3e31-4caf-803e-1b33caf907b5" (UID: "4829cb82-3e31-4caf-803e-1b33caf907b5"). InnerVolumeSpecName "kube-api-access-xqkz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:29:20 crc kubenswrapper[4565]: I1125 09:29:20.882442 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4829cb82-3e31-4caf-803e-1b33caf907b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4829cb82-3e31-4caf-803e-1b33caf907b5" (UID: "4829cb82-3e31-4caf-803e-1b33caf907b5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:29:20 crc kubenswrapper[4565]: I1125 09:29:20.886451 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqkz7\" (UniqueName: \"kubernetes.io/projected/4829cb82-3e31-4caf-803e-1b33caf907b5-kube-api-access-xqkz7\") on node \"crc\" DevicePath \"\"" Nov 25 09:29:20 crc kubenswrapper[4565]: I1125 09:29:20.886496 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4829cb82-3e31-4caf-803e-1b33caf907b5-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:29:21 crc kubenswrapper[4565]: I1125 09:29:21.262216 4565 generic.go:334] "Generic (PLEG): container finished" podID="4829cb82-3e31-4caf-803e-1b33caf907b5" containerID="e67e839bffd81e23eb392f4a1fb9435cec6333c8d36b6c06ee5e949c0f49d9e5" exitCode=0 Nov 25 09:29:21 crc kubenswrapper[4565]: I1125 09:29:21.263288 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hvm2d" Nov 25 09:29:21 crc kubenswrapper[4565]: I1125 09:29:21.263785 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvm2d" event={"ID":"4829cb82-3e31-4caf-803e-1b33caf907b5","Type":"ContainerDied","Data":"e67e839bffd81e23eb392f4a1fb9435cec6333c8d36b6c06ee5e949c0f49d9e5"} Nov 25 09:29:21 crc kubenswrapper[4565]: I1125 09:29:21.263822 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvm2d" event={"ID":"4829cb82-3e31-4caf-803e-1b33caf907b5","Type":"ContainerDied","Data":"8528998afe2a69316da09fdd4325b9f51ace753f5d34d9db51bb65455479b64d"} Nov 25 09:29:21 crc kubenswrapper[4565]: I1125 09:29:21.263845 4565 scope.go:117] "RemoveContainer" containerID="e67e839bffd81e23eb392f4a1fb9435cec6333c8d36b6c06ee5e949c0f49d9e5" Nov 25 09:29:21 crc kubenswrapper[4565]: I1125 09:29:21.304489 4565 scope.go:117] "RemoveContainer" 
containerID="31c474878e142acec25a69e433e7a625b8bc0a57998585a4a84b05f5af826e6f" Nov 25 09:29:21 crc kubenswrapper[4565]: I1125 09:29:21.308052 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hvm2d"] Nov 25 09:29:21 crc kubenswrapper[4565]: I1125 09:29:21.318740 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hvm2d"] Nov 25 09:29:21 crc kubenswrapper[4565]: I1125 09:29:21.333559 4565 scope.go:117] "RemoveContainer" containerID="4ed6fe80aad7a14196c486538cca0f8701ca877df1e30e9a4012c0a27488b6d9" Nov 25 09:29:21 crc kubenswrapper[4565]: I1125 09:29:21.368365 4565 scope.go:117] "RemoveContainer" containerID="e67e839bffd81e23eb392f4a1fb9435cec6333c8d36b6c06ee5e949c0f49d9e5" Nov 25 09:29:21 crc kubenswrapper[4565]: E1125 09:29:21.372973 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e67e839bffd81e23eb392f4a1fb9435cec6333c8d36b6c06ee5e949c0f49d9e5\": container with ID starting with e67e839bffd81e23eb392f4a1fb9435cec6333c8d36b6c06ee5e949c0f49d9e5 not found: ID does not exist" containerID="e67e839bffd81e23eb392f4a1fb9435cec6333c8d36b6c06ee5e949c0f49d9e5" Nov 25 09:29:21 crc kubenswrapper[4565]: I1125 09:29:21.373032 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e67e839bffd81e23eb392f4a1fb9435cec6333c8d36b6c06ee5e949c0f49d9e5"} err="failed to get container status \"e67e839bffd81e23eb392f4a1fb9435cec6333c8d36b6c06ee5e949c0f49d9e5\": rpc error: code = NotFound desc = could not find container \"e67e839bffd81e23eb392f4a1fb9435cec6333c8d36b6c06ee5e949c0f49d9e5\": container with ID starting with e67e839bffd81e23eb392f4a1fb9435cec6333c8d36b6c06ee5e949c0f49d9e5 not found: ID does not exist" Nov 25 09:29:21 crc kubenswrapper[4565]: I1125 09:29:21.373065 4565 scope.go:117] "RemoveContainer" 
containerID="31c474878e142acec25a69e433e7a625b8bc0a57998585a4a84b05f5af826e6f" Nov 25 09:29:21 crc kubenswrapper[4565]: E1125 09:29:21.374786 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31c474878e142acec25a69e433e7a625b8bc0a57998585a4a84b05f5af826e6f\": container with ID starting with 31c474878e142acec25a69e433e7a625b8bc0a57998585a4a84b05f5af826e6f not found: ID does not exist" containerID="31c474878e142acec25a69e433e7a625b8bc0a57998585a4a84b05f5af826e6f" Nov 25 09:29:21 crc kubenswrapper[4565]: I1125 09:29:21.374893 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31c474878e142acec25a69e433e7a625b8bc0a57998585a4a84b05f5af826e6f"} err="failed to get container status \"31c474878e142acec25a69e433e7a625b8bc0a57998585a4a84b05f5af826e6f\": rpc error: code = NotFound desc = could not find container \"31c474878e142acec25a69e433e7a625b8bc0a57998585a4a84b05f5af826e6f\": container with ID starting with 31c474878e142acec25a69e433e7a625b8bc0a57998585a4a84b05f5af826e6f not found: ID does not exist" Nov 25 09:29:21 crc kubenswrapper[4565]: I1125 09:29:21.374992 4565 scope.go:117] "RemoveContainer" containerID="4ed6fe80aad7a14196c486538cca0f8701ca877df1e30e9a4012c0a27488b6d9" Nov 25 09:29:21 crc kubenswrapper[4565]: E1125 09:29:21.375733 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ed6fe80aad7a14196c486538cca0f8701ca877df1e30e9a4012c0a27488b6d9\": container with ID starting with 4ed6fe80aad7a14196c486538cca0f8701ca877df1e30e9a4012c0a27488b6d9 not found: ID does not exist" containerID="4ed6fe80aad7a14196c486538cca0f8701ca877df1e30e9a4012c0a27488b6d9" Nov 25 09:29:21 crc kubenswrapper[4565]: I1125 09:29:21.375816 4565 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4ed6fe80aad7a14196c486538cca0f8701ca877df1e30e9a4012c0a27488b6d9"} err="failed to get container status \"4ed6fe80aad7a14196c486538cca0f8701ca877df1e30e9a4012c0a27488b6d9\": rpc error: code = NotFound desc = could not find container \"4ed6fe80aad7a14196c486538cca0f8701ca877df1e30e9a4012c0a27488b6d9\": container with ID starting with 4ed6fe80aad7a14196c486538cca0f8701ca877df1e30e9a4012c0a27488b6d9 not found: ID does not exist" Nov 25 09:29:23 crc kubenswrapper[4565]: I1125 09:29:23.109224 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4829cb82-3e31-4caf-803e-1b33caf907b5" path="/var/lib/kubelet/pods/4829cb82-3e31-4caf-803e-1b33caf907b5/volumes" Nov 25 09:29:25 crc kubenswrapper[4565]: I1125 09:29:25.099184 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:29:25 crc kubenswrapper[4565]: I1125 09:29:25.100152 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:29:25 crc kubenswrapper[4565]: I1125 09:29:25.108864 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 09:29:25 crc kubenswrapper[4565]: I1125 09:29:25.109567 4565 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d"} 
pod="openshift-machine-config-operator/machine-config-daemon-r28bt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 09:29:25 crc kubenswrapper[4565]: I1125 09:29:25.109670 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" containerID="cri-o://892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d" gracePeriod=600 Nov 25 09:29:25 crc kubenswrapper[4565]: E1125 09:29:25.230025 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:29:25 crc kubenswrapper[4565]: I1125 09:29:25.304312 4565 generic.go:334] "Generic (PLEG): container finished" podID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d" exitCode=0 Nov 25 09:29:25 crc kubenswrapper[4565]: I1125 09:29:25.304395 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerDied","Data":"892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d"} Nov 25 09:29:25 crc kubenswrapper[4565]: I1125 09:29:25.304858 4565 scope.go:117] "RemoveContainer" containerID="7ed4699c003b3641688dbcd2051893b81e4b0be01a977d95c172b92c4c0191bf" Nov 25 09:29:25 crc kubenswrapper[4565]: I1125 09:29:25.305535 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d" Nov 
25 09:29:25 crc kubenswrapper[4565]: E1125 09:29:25.305989 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:29:26 crc kubenswrapper[4565]: I1125 09:29:26.039556 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-wz4lt"] Nov 25 09:29:26 crc kubenswrapper[4565]: I1125 09:29:26.048260 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-wz4lt"] Nov 25 09:29:27 crc kubenswrapper[4565]: I1125 09:29:27.114112 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="754ce7b0-5a2a-4206-a901-94010cec0e08" path="/var/lib/kubelet/pods/754ce7b0-5a2a-4206-a901-94010cec0e08/volumes" Nov 25 09:29:28 crc kubenswrapper[4565]: I1125 09:29:28.024958 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-4k5dr"] Nov 25 09:29:28 crc kubenswrapper[4565]: I1125 09:29:28.032299 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-4k5dr"] Nov 25 09:29:29 crc kubenswrapper[4565]: I1125 09:29:29.107160 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dc7a6cc-0061-40f7-a263-0d3fc86ce86e" path="/var/lib/kubelet/pods/8dc7a6cc-0061-40f7-a263-0d3fc86ce86e/volumes" Nov 25 09:29:32 crc kubenswrapper[4565]: I1125 09:29:32.022877 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-bs9qc"] Nov 25 09:29:32 crc kubenswrapper[4565]: I1125 09:29:32.028689 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-bs9qc"] Nov 25 09:29:33 crc kubenswrapper[4565]: I1125 
09:29:33.107978 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fad94c28-1e87-4955-9d65-ca6cfdd4087f" path="/var/lib/kubelet/pods/fad94c28-1e87-4955-9d65-ca6cfdd4087f/volumes" Nov 25 09:29:34 crc kubenswrapper[4565]: I1125 09:29:34.694851 4565 scope.go:117] "RemoveContainer" containerID="a2c03d5a24bdffb51829758b74b615e235cc84bff56fa4b9659cd4ef32d4b727" Nov 25 09:29:34 crc kubenswrapper[4565]: I1125 09:29:34.721556 4565 scope.go:117] "RemoveContainer" containerID="012796fe09ecc36f52792d0175b31e322ecf7a3d334f4b8f042a6f239a937fbe" Nov 25 09:29:34 crc kubenswrapper[4565]: I1125 09:29:34.762234 4565 scope.go:117] "RemoveContainer" containerID="cb0fc42d94f3da9546b989d13eb383ca0590743c2fa36cc77ab49f64f5d6a8ad" Nov 25 09:29:34 crc kubenswrapper[4565]: I1125 09:29:34.777759 4565 scope.go:117] "RemoveContainer" containerID="21b8de1db821f19fcd7a96e05e125e48165f9c3e90de2342de80a33e5a1481aa" Nov 25 09:29:34 crc kubenswrapper[4565]: I1125 09:29:34.833120 4565 scope.go:117] "RemoveContainer" containerID="f31b0601f66bb2a9d97783adb1fbfcc52a5ea8dff246d30bf3c1496084e3134e" Nov 25 09:29:34 crc kubenswrapper[4565]: I1125 09:29:34.850942 4565 scope.go:117] "RemoveContainer" containerID="a8e2f4be35e77c4696ff5c86e735ce6fa04a8a3578040e82843b64261146dff7" Nov 25 09:29:34 crc kubenswrapper[4565]: I1125 09:29:34.877087 4565 scope.go:117] "RemoveContainer" containerID="536b7ef9f5243b894bb6a0f1e32b92826ef47e276d76df599efa15830e42ccde" Nov 25 09:29:34 crc kubenswrapper[4565]: I1125 09:29:34.908683 4565 scope.go:117] "RemoveContainer" containerID="33440ebbdd12140fa0cd8cca99e21d5931715b4988161be46d272172afd80dfd" Nov 25 09:29:34 crc kubenswrapper[4565]: I1125 09:29:34.949336 4565 scope.go:117] "RemoveContainer" containerID="6b5671d101ba4871ee228dd23daa78162ab92ec583a6f2f91cc5fc0c43c4c9dc" Nov 25 09:29:34 crc kubenswrapper[4565]: I1125 09:29:34.977328 4565 scope.go:117] "RemoveContainer" containerID="699588041b9fa9a82019df4339e6f4596a5d5e7c1902dfb383b2b5d2e4aa5a54" Nov 25 
09:29:35 crc kubenswrapper[4565]: I1125 09:29:35.004406 4565 scope.go:117] "RemoveContainer" containerID="a5be5ca82d369b44e99466facfc5e44236be158a00d4f8b0e1ff5bb38019e1b3" Nov 25 09:29:35 crc kubenswrapper[4565]: I1125 09:29:35.029463 4565 scope.go:117] "RemoveContainer" containerID="4439f437b0343e7149841b89b99d546734bf695a5c31c28f5e85896dfcb24c89" Nov 25 09:29:38 crc kubenswrapper[4565]: I1125 09:29:38.097613 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d" Nov 25 09:29:38 crc kubenswrapper[4565]: E1125 09:29:38.098577 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:29:47 crc kubenswrapper[4565]: I1125 09:29:47.035395 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-9gszq"] Nov 25 09:29:47 crc kubenswrapper[4565]: I1125 09:29:47.047955 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-9gszq"] Nov 25 09:29:47 crc kubenswrapper[4565]: I1125 09:29:47.106352 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33e21a69-41ac-4166-bfeb-6ec0eaff7e64" path="/var/lib/kubelet/pods/33e21a69-41ac-4166-bfeb-6ec0eaff7e64/volumes" Nov 25 09:29:49 crc kubenswrapper[4565]: I1125 09:29:49.098338 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d" Nov 25 09:29:49 crc kubenswrapper[4565]: E1125 09:29:49.099080 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:29:58 crc kubenswrapper[4565]: I1125 09:29:58.628282 4565 generic.go:334] "Generic (PLEG): container finished" podID="d461ae1d-1d53-43af-830d-a4e301627bf9" containerID="50c07b19060375a1c048d913967518c2e8da6a677e127af3bd7b4138228d1660" exitCode=0 Nov 25 09:29:58 crc kubenswrapper[4565]: I1125 09:29:58.628383 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sfksn" event={"ID":"d461ae1d-1d53-43af-830d-a4e301627bf9","Type":"ContainerDied","Data":"50c07b19060375a1c048d913967518c2e8da6a677e127af3bd7b4138228d1660"} Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.151721 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401050-kld9x"] Nov 25 09:30:00 crc kubenswrapper[4565]: E1125 09:30:00.152478 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4829cb82-3e31-4caf-803e-1b33caf907b5" containerName="extract-utilities" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.152494 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4829cb82-3e31-4caf-803e-1b33caf907b5" containerName="extract-utilities" Nov 25 09:30:00 crc kubenswrapper[4565]: E1125 09:30:00.152507 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4829cb82-3e31-4caf-803e-1b33caf907b5" containerName="registry-server" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.152512 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4829cb82-3e31-4caf-803e-1b33caf907b5" containerName="registry-server" Nov 25 09:30:00 crc kubenswrapper[4565]: E1125 09:30:00.152542 4565 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4829cb82-3e31-4caf-803e-1b33caf907b5" containerName="extract-content" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.152547 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4829cb82-3e31-4caf-803e-1b33caf907b5" containerName="extract-content" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.152721 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="4829cb82-3e31-4caf-803e-1b33caf907b5" containerName="registry-server" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.153538 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401050-kld9x" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.158224 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.158260 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.165024 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401050-kld9x"] Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.190361 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sfksn" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.255507 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f9h5\" (UniqueName: \"kubernetes.io/projected/01f90bd7-1f21-496a-b040-3cbaed72bea6-kube-api-access-7f9h5\") pod \"collect-profiles-29401050-kld9x\" (UID: \"01f90bd7-1f21-496a-b040-3cbaed72bea6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401050-kld9x" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.255647 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/01f90bd7-1f21-496a-b040-3cbaed72bea6-secret-volume\") pod \"collect-profiles-29401050-kld9x\" (UID: \"01f90bd7-1f21-496a-b040-3cbaed72bea6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401050-kld9x" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.255813 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01f90bd7-1f21-496a-b040-3cbaed72bea6-config-volume\") pod \"collect-profiles-29401050-kld9x\" (UID: \"01f90bd7-1f21-496a-b040-3cbaed72bea6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401050-kld9x" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.356453 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d461ae1d-1d53-43af-830d-a4e301627bf9-ssh-key\") pod \"d461ae1d-1d53-43af-830d-a4e301627bf9\" (UID: \"d461ae1d-1d53-43af-830d-a4e301627bf9\") " Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.356631 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/d461ae1d-1d53-43af-830d-a4e301627bf9-inventory\") pod \"d461ae1d-1d53-43af-830d-a4e301627bf9\" (UID: \"d461ae1d-1d53-43af-830d-a4e301627bf9\") " Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.356922 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dp7v\" (UniqueName: \"kubernetes.io/projected/d461ae1d-1d53-43af-830d-a4e301627bf9-kube-api-access-5dp7v\") pod \"d461ae1d-1d53-43af-830d-a4e301627bf9\" (UID: \"d461ae1d-1d53-43af-830d-a4e301627bf9\") " Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.357389 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/01f90bd7-1f21-496a-b040-3cbaed72bea6-secret-volume\") pod \"collect-profiles-29401050-kld9x\" (UID: \"01f90bd7-1f21-496a-b040-3cbaed72bea6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401050-kld9x" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.357474 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01f90bd7-1f21-496a-b040-3cbaed72bea6-config-volume\") pod \"collect-profiles-29401050-kld9x\" (UID: \"01f90bd7-1f21-496a-b040-3cbaed72bea6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401050-kld9x" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.357816 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f9h5\" (UniqueName: \"kubernetes.io/projected/01f90bd7-1f21-496a-b040-3cbaed72bea6-kube-api-access-7f9h5\") pod \"collect-profiles-29401050-kld9x\" (UID: \"01f90bd7-1f21-496a-b040-3cbaed72bea6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401050-kld9x" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.359044 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/01f90bd7-1f21-496a-b040-3cbaed72bea6-config-volume\") pod \"collect-profiles-29401050-kld9x\" (UID: \"01f90bd7-1f21-496a-b040-3cbaed72bea6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401050-kld9x" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.369164 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d461ae1d-1d53-43af-830d-a4e301627bf9-kube-api-access-5dp7v" (OuterVolumeSpecName: "kube-api-access-5dp7v") pod "d461ae1d-1d53-43af-830d-a4e301627bf9" (UID: "d461ae1d-1d53-43af-830d-a4e301627bf9"). InnerVolumeSpecName "kube-api-access-5dp7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.385175 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/01f90bd7-1f21-496a-b040-3cbaed72bea6-secret-volume\") pod \"collect-profiles-29401050-kld9x\" (UID: \"01f90bd7-1f21-496a-b040-3cbaed72bea6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401050-kld9x" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.385824 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f9h5\" (UniqueName: \"kubernetes.io/projected/01f90bd7-1f21-496a-b040-3cbaed72bea6-kube-api-access-7f9h5\") pod \"collect-profiles-29401050-kld9x\" (UID: \"01f90bd7-1f21-496a-b040-3cbaed72bea6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401050-kld9x" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.388480 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d461ae1d-1d53-43af-830d-a4e301627bf9-inventory" (OuterVolumeSpecName: "inventory") pod "d461ae1d-1d53-43af-830d-a4e301627bf9" (UID: "d461ae1d-1d53-43af-830d-a4e301627bf9"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.390963 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d461ae1d-1d53-43af-830d-a4e301627bf9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d461ae1d-1d53-43af-830d-a4e301627bf9" (UID: "d461ae1d-1d53-43af-830d-a4e301627bf9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.460355 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dp7v\" (UniqueName: \"kubernetes.io/projected/d461ae1d-1d53-43af-830d-a4e301627bf9-kube-api-access-5dp7v\") on node \"crc\" DevicePath \"\"" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.460693 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d461ae1d-1d53-43af-830d-a4e301627bf9-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.460709 4565 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d461ae1d-1d53-43af-830d-a4e301627bf9-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.501263 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401050-kld9x" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.656909 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sfksn" event={"ID":"d461ae1d-1d53-43af-830d-a4e301627bf9","Type":"ContainerDied","Data":"1d5c1cf2a4d024dc67b5919c20ab9799212b0a815962321f12d4414715b1cc48"} Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.656978 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d5c1cf2a4d024dc67b5919c20ab9799212b0a815962321f12d4414715b1cc48" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.657059 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sfksn" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.757362 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-gjznj"] Nov 25 09:30:00 crc kubenswrapper[4565]: E1125 09:30:00.759308 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d461ae1d-1d53-43af-830d-a4e301627bf9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.759337 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d461ae1d-1d53-43af-830d-a4e301627bf9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.759854 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="d461ae1d-1d53-43af-830d-a4e301627bf9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.760898 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-gjznj" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.763180 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.764819 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.765135 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.765505 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.771197 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-gjznj"] Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.787815 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/70daaa2e-42b4-4cf4-9b5a-cb44a6293db8-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-gjznj\" (UID: \"70daaa2e-42b4-4cf4-9b5a-cb44a6293db8\") " pod="openstack/ssh-known-hosts-edpm-deployment-gjznj" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.787862 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmk2v\" (UniqueName: \"kubernetes.io/projected/70daaa2e-42b4-4cf4-9b5a-cb44a6293db8-kube-api-access-kmk2v\") pod \"ssh-known-hosts-edpm-deployment-gjznj\" (UID: \"70daaa2e-42b4-4cf4-9b5a-cb44a6293db8\") " pod="openstack/ssh-known-hosts-edpm-deployment-gjznj" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.787904 4565 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/70daaa2e-42b4-4cf4-9b5a-cb44a6293db8-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-gjznj\" (UID: \"70daaa2e-42b4-4cf4-9b5a-cb44a6293db8\") " pod="openstack/ssh-known-hosts-edpm-deployment-gjznj" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.892060 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/70daaa2e-42b4-4cf4-9b5a-cb44a6293db8-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-gjznj\" (UID: \"70daaa2e-42b4-4cf4-9b5a-cb44a6293db8\") " pod="openstack/ssh-known-hosts-edpm-deployment-gjznj" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.892405 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmk2v\" (UniqueName: \"kubernetes.io/projected/70daaa2e-42b4-4cf4-9b5a-cb44a6293db8-kube-api-access-kmk2v\") pod \"ssh-known-hosts-edpm-deployment-gjznj\" (UID: \"70daaa2e-42b4-4cf4-9b5a-cb44a6293db8\") " pod="openstack/ssh-known-hosts-edpm-deployment-gjznj" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.892465 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/70daaa2e-42b4-4cf4-9b5a-cb44a6293db8-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-gjznj\" (UID: \"70daaa2e-42b4-4cf4-9b5a-cb44a6293db8\") " pod="openstack/ssh-known-hosts-edpm-deployment-gjznj" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.898432 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/70daaa2e-42b4-4cf4-9b5a-cb44a6293db8-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-gjznj\" (UID: \"70daaa2e-42b4-4cf4-9b5a-cb44a6293db8\") " pod="openstack/ssh-known-hosts-edpm-deployment-gjznj" Nov 25 09:30:00 crc 
kubenswrapper[4565]: I1125 09:30:00.898436 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/70daaa2e-42b4-4cf4-9b5a-cb44a6293db8-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-gjznj\" (UID: \"70daaa2e-42b4-4cf4-9b5a-cb44a6293db8\") " pod="openstack/ssh-known-hosts-edpm-deployment-gjznj" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.909620 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmk2v\" (UniqueName: \"kubernetes.io/projected/70daaa2e-42b4-4cf4-9b5a-cb44a6293db8-kube-api-access-kmk2v\") pod \"ssh-known-hosts-edpm-deployment-gjznj\" (UID: \"70daaa2e-42b4-4cf4-9b5a-cb44a6293db8\") " pod="openstack/ssh-known-hosts-edpm-deployment-gjznj" Nov 25 09:30:00 crc kubenswrapper[4565]: I1125 09:30:00.943265 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401050-kld9x"] Nov 25 09:30:01 crc kubenswrapper[4565]: I1125 09:30:01.080799 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-gjznj" Nov 25 09:30:01 crc kubenswrapper[4565]: I1125 09:30:01.617195 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-gjznj"] Nov 25 09:30:01 crc kubenswrapper[4565]: W1125 09:30:01.624621 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70daaa2e_42b4_4cf4_9b5a_cb44a6293db8.slice/crio-316a68a0661127296cee70b477250b207de072cf5518e50097ec83d121ba8746 WatchSource:0}: Error finding container 316a68a0661127296cee70b477250b207de072cf5518e50097ec83d121ba8746: Status 404 returned error can't find the container with id 316a68a0661127296cee70b477250b207de072cf5518e50097ec83d121ba8746 Nov 25 09:30:01 crc kubenswrapper[4565]: I1125 09:30:01.664661 4565 generic.go:334] "Generic (PLEG): container finished" podID="01f90bd7-1f21-496a-b040-3cbaed72bea6" containerID="c7f59b994cb8ed4c077d58bf2969e4b9ac349a22768851246d6ae90272168a2e" exitCode=0 Nov 25 09:30:01 crc kubenswrapper[4565]: I1125 09:30:01.664725 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401050-kld9x" event={"ID":"01f90bd7-1f21-496a-b040-3cbaed72bea6","Type":"ContainerDied","Data":"c7f59b994cb8ed4c077d58bf2969e4b9ac349a22768851246d6ae90272168a2e"} Nov 25 09:30:01 crc kubenswrapper[4565]: I1125 09:30:01.664999 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401050-kld9x" event={"ID":"01f90bd7-1f21-496a-b040-3cbaed72bea6","Type":"ContainerStarted","Data":"a81924e7e16ddbbadfe3c70f204f2f099f77ca4600ae2d96fc7d28ca4ee92b34"} Nov 25 09:30:01 crc kubenswrapper[4565]: I1125 09:30:01.667001 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-gjznj" 
event={"ID":"70daaa2e-42b4-4cf4-9b5a-cb44a6293db8","Type":"ContainerStarted","Data":"316a68a0661127296cee70b477250b207de072cf5518e50097ec83d121ba8746"} Nov 25 09:30:02 crc kubenswrapper[4565]: I1125 09:30:02.676800 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-gjznj" event={"ID":"70daaa2e-42b4-4cf4-9b5a-cb44a6293db8","Type":"ContainerStarted","Data":"e888ae7606b63e7c323b04ea176dbf72d037d6fd3f710b8424aa80f24de7555d"} Nov 25 09:30:02 crc kubenswrapper[4565]: I1125 09:30:02.703464 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-gjznj" podStartSLOduration=2.198097999 podStartE2EDuration="2.703444408s" podCreationTimestamp="2025-11-25 09:30:00 +0000 UTC" firstStartedPulling="2025-11-25 09:30:01.627512927 +0000 UTC m=+1534.830008065" lastFinishedPulling="2025-11-25 09:30:02.132859347 +0000 UTC m=+1535.335354474" observedRunningTime="2025-11-25 09:30:02.693515259 +0000 UTC m=+1535.896010397" watchObservedRunningTime="2025-11-25 09:30:02.703444408 +0000 UTC m=+1535.905939546" Nov 25 09:30:02 crc kubenswrapper[4565]: I1125 09:30:02.973869 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401050-kld9x" Nov 25 09:30:03 crc kubenswrapper[4565]: I1125 09:30:03.039667 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/01f90bd7-1f21-496a-b040-3cbaed72bea6-secret-volume\") pod \"01f90bd7-1f21-496a-b040-3cbaed72bea6\" (UID: \"01f90bd7-1f21-496a-b040-3cbaed72bea6\") " Nov 25 09:30:03 crc kubenswrapper[4565]: I1125 09:30:03.039817 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01f90bd7-1f21-496a-b040-3cbaed72bea6-config-volume\") pod \"01f90bd7-1f21-496a-b040-3cbaed72bea6\" (UID: \"01f90bd7-1f21-496a-b040-3cbaed72bea6\") " Nov 25 09:30:03 crc kubenswrapper[4565]: I1125 09:30:03.039849 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f9h5\" (UniqueName: \"kubernetes.io/projected/01f90bd7-1f21-496a-b040-3cbaed72bea6-kube-api-access-7f9h5\") pod \"01f90bd7-1f21-496a-b040-3cbaed72bea6\" (UID: \"01f90bd7-1f21-496a-b040-3cbaed72bea6\") " Nov 25 09:30:03 crc kubenswrapper[4565]: I1125 09:30:03.041609 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01f90bd7-1f21-496a-b040-3cbaed72bea6-config-volume" (OuterVolumeSpecName: "config-volume") pod "01f90bd7-1f21-496a-b040-3cbaed72bea6" (UID: "01f90bd7-1f21-496a-b040-3cbaed72bea6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:30:03 crc kubenswrapper[4565]: I1125 09:30:03.045406 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01f90bd7-1f21-496a-b040-3cbaed72bea6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "01f90bd7-1f21-496a-b040-3cbaed72bea6" (UID: "01f90bd7-1f21-496a-b040-3cbaed72bea6"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:30:03 crc kubenswrapper[4565]: I1125 09:30:03.045575 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01f90bd7-1f21-496a-b040-3cbaed72bea6-kube-api-access-7f9h5" (OuterVolumeSpecName: "kube-api-access-7f9h5") pod "01f90bd7-1f21-496a-b040-3cbaed72bea6" (UID: "01f90bd7-1f21-496a-b040-3cbaed72bea6"). InnerVolumeSpecName "kube-api-access-7f9h5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:30:03 crc kubenswrapper[4565]: I1125 09:30:03.098380 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d" Nov 25 09:30:03 crc kubenswrapper[4565]: E1125 09:30:03.099278 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:30:03 crc kubenswrapper[4565]: I1125 09:30:03.142500 4565 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/01f90bd7-1f21-496a-b040-3cbaed72bea6-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 25 09:30:03 crc kubenswrapper[4565]: I1125 09:30:03.142629 4565 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01f90bd7-1f21-496a-b040-3cbaed72bea6-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 09:30:03 crc kubenswrapper[4565]: I1125 09:30:03.142767 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f9h5\" (UniqueName: \"kubernetes.io/projected/01f90bd7-1f21-496a-b040-3cbaed72bea6-kube-api-access-7f9h5\") on node \"crc\" DevicePath \"\"" 
Nov 25 09:30:03 crc kubenswrapper[4565]: I1125 09:30:03.697079 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401050-kld9x" Nov 25 09:30:03 crc kubenswrapper[4565]: I1125 09:30:03.697249 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401050-kld9x" event={"ID":"01f90bd7-1f21-496a-b040-3cbaed72bea6","Type":"ContainerDied","Data":"a81924e7e16ddbbadfe3c70f204f2f099f77ca4600ae2d96fc7d28ca4ee92b34"} Nov 25 09:30:03 crc kubenswrapper[4565]: I1125 09:30:03.697327 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a81924e7e16ddbbadfe3c70f204f2f099f77ca4600ae2d96fc7d28ca4ee92b34" Nov 25 09:30:08 crc kubenswrapper[4565]: I1125 09:30:08.745469 4565 generic.go:334] "Generic (PLEG): container finished" podID="70daaa2e-42b4-4cf4-9b5a-cb44a6293db8" containerID="e888ae7606b63e7c323b04ea176dbf72d037d6fd3f710b8424aa80f24de7555d" exitCode=0 Nov 25 09:30:08 crc kubenswrapper[4565]: I1125 09:30:08.745566 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-gjznj" event={"ID":"70daaa2e-42b4-4cf4-9b5a-cb44a6293db8","Type":"ContainerDied","Data":"e888ae7606b63e7c323b04ea176dbf72d037d6fd3f710b8424aa80f24de7555d"} Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.083028 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-gjznj" Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.205104 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/70daaa2e-42b4-4cf4-9b5a-cb44a6293db8-inventory-0\") pod \"70daaa2e-42b4-4cf4-9b5a-cb44a6293db8\" (UID: \"70daaa2e-42b4-4cf4-9b5a-cb44a6293db8\") " Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.205590 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/70daaa2e-42b4-4cf4-9b5a-cb44a6293db8-ssh-key-openstack-edpm-ipam\") pod \"70daaa2e-42b4-4cf4-9b5a-cb44a6293db8\" (UID: \"70daaa2e-42b4-4cf4-9b5a-cb44a6293db8\") " Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.205792 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmk2v\" (UniqueName: \"kubernetes.io/projected/70daaa2e-42b4-4cf4-9b5a-cb44a6293db8-kube-api-access-kmk2v\") pod \"70daaa2e-42b4-4cf4-9b5a-cb44a6293db8\" (UID: \"70daaa2e-42b4-4cf4-9b5a-cb44a6293db8\") " Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.213549 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70daaa2e-42b4-4cf4-9b5a-cb44a6293db8-kube-api-access-kmk2v" (OuterVolumeSpecName: "kube-api-access-kmk2v") pod "70daaa2e-42b4-4cf4-9b5a-cb44a6293db8" (UID: "70daaa2e-42b4-4cf4-9b5a-cb44a6293db8"). InnerVolumeSpecName "kube-api-access-kmk2v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.229281 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70daaa2e-42b4-4cf4-9b5a-cb44a6293db8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "70daaa2e-42b4-4cf4-9b5a-cb44a6293db8" (UID: "70daaa2e-42b4-4cf4-9b5a-cb44a6293db8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.232416 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70daaa2e-42b4-4cf4-9b5a-cb44a6293db8-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "70daaa2e-42b4-4cf4-9b5a-cb44a6293db8" (UID: "70daaa2e-42b4-4cf4-9b5a-cb44a6293db8"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.308848 4565 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/70daaa2e-42b4-4cf4-9b5a-cb44a6293db8-inventory-0\") on node \"crc\" DevicePath \"\"" Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.308886 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/70daaa2e-42b4-4cf4-9b5a-cb44a6293db8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.308904 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmk2v\" (UniqueName: \"kubernetes.io/projected/70daaa2e-42b4-4cf4-9b5a-cb44a6293db8-kube-api-access-kmk2v\") on node \"crc\" DevicePath \"\"" Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.776880 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-gjznj" 
event={"ID":"70daaa2e-42b4-4cf4-9b5a-cb44a6293db8","Type":"ContainerDied","Data":"316a68a0661127296cee70b477250b207de072cf5518e50097ec83d121ba8746"} Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.776961 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="316a68a0661127296cee70b477250b207de072cf5518e50097ec83d121ba8746" Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.776994 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-gjznj" Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.841373 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpc87"] Nov 25 09:30:10 crc kubenswrapper[4565]: E1125 09:30:10.842033 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70daaa2e-42b4-4cf4-9b5a-cb44a6293db8" containerName="ssh-known-hosts-edpm-deployment" Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.842149 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="70daaa2e-42b4-4cf4-9b5a-cb44a6293db8" containerName="ssh-known-hosts-edpm-deployment" Nov 25 09:30:10 crc kubenswrapper[4565]: E1125 09:30:10.842207 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f90bd7-1f21-496a-b040-3cbaed72bea6" containerName="collect-profiles" Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.842250 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f90bd7-1f21-496a-b040-3cbaed72bea6" containerName="collect-profiles" Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.842538 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="70daaa2e-42b4-4cf4-9b5a-cb44a6293db8" containerName="ssh-known-hosts-edpm-deployment" Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.845599 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="01f90bd7-1f21-496a-b040-3cbaed72bea6" containerName="collect-profiles" 
Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.846368 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpc87"
Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.848358 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.848695 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc"
Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.849139 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.849447 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.862664 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpc87"]
Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.922075 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrp5p\" (UniqueName: \"kubernetes.io/projected/a799faaf-4d07-46a0-8b5b-ffa7af279ab7-kube-api-access-rrp5p\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qpc87\" (UID: \"a799faaf-4d07-46a0-8b5b-ffa7af279ab7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpc87"
Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.922155 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a799faaf-4d07-46a0-8b5b-ffa7af279ab7-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qpc87\" (UID: \"a799faaf-4d07-46a0-8b5b-ffa7af279ab7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpc87"
Nov 25 09:30:10 crc kubenswrapper[4565]: I1125 09:30:10.922185 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a799faaf-4d07-46a0-8b5b-ffa7af279ab7-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qpc87\" (UID: \"a799faaf-4d07-46a0-8b5b-ffa7af279ab7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpc87"
Nov 25 09:30:11 crc kubenswrapper[4565]: I1125 09:30:11.024391 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a799faaf-4d07-46a0-8b5b-ffa7af279ab7-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qpc87\" (UID: \"a799faaf-4d07-46a0-8b5b-ffa7af279ab7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpc87"
Nov 25 09:30:11 crc kubenswrapper[4565]: I1125 09:30:11.024803 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a799faaf-4d07-46a0-8b5b-ffa7af279ab7-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qpc87\" (UID: \"a799faaf-4d07-46a0-8b5b-ffa7af279ab7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpc87"
Nov 25 09:30:11 crc kubenswrapper[4565]: I1125 09:30:11.025114 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrp5p\" (UniqueName: \"kubernetes.io/projected/a799faaf-4d07-46a0-8b5b-ffa7af279ab7-kube-api-access-rrp5p\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qpc87\" (UID: \"a799faaf-4d07-46a0-8b5b-ffa7af279ab7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpc87"
Nov 25 09:30:11 crc kubenswrapper[4565]: I1125 09:30:11.032149 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a799faaf-4d07-46a0-8b5b-ffa7af279ab7-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qpc87\" (UID: \"a799faaf-4d07-46a0-8b5b-ffa7af279ab7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpc87"
Nov 25 09:30:11 crc kubenswrapper[4565]: I1125 09:30:11.032289 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a799faaf-4d07-46a0-8b5b-ffa7af279ab7-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qpc87\" (UID: \"a799faaf-4d07-46a0-8b5b-ffa7af279ab7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpc87"
Nov 25 09:30:11 crc kubenswrapper[4565]: I1125 09:30:11.041889 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrp5p\" (UniqueName: \"kubernetes.io/projected/a799faaf-4d07-46a0-8b5b-ffa7af279ab7-kube-api-access-rrp5p\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qpc87\" (UID: \"a799faaf-4d07-46a0-8b5b-ffa7af279ab7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpc87"
Nov 25 09:30:11 crc kubenswrapper[4565]: I1125 09:30:11.175116 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpc87"
Nov 25 09:30:11 crc kubenswrapper[4565]: I1125 09:30:11.660764 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpc87"]
Nov 25 09:30:11 crc kubenswrapper[4565]: I1125 09:30:11.792716 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpc87" event={"ID":"a799faaf-4d07-46a0-8b5b-ffa7af279ab7","Type":"ContainerStarted","Data":"74f49fdb811d6727e6af3df94ca5a4a3267e2b30641de60f118825d933d94f10"}
Nov 25 09:30:12 crc kubenswrapper[4565]: I1125 09:30:12.805951 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpc87" event={"ID":"a799faaf-4d07-46a0-8b5b-ffa7af279ab7","Type":"ContainerStarted","Data":"43c4ff9bf5adca0cb9b01593e8a485280d1f9c8e33dd39486e7624eba6119c25"}
Nov 25 09:30:12 crc kubenswrapper[4565]: I1125 09:30:12.829286 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpc87" podStartSLOduration=2.319323889 podStartE2EDuration="2.829266934s" podCreationTimestamp="2025-11-25 09:30:10 +0000 UTC" firstStartedPulling="2025-11-25 09:30:11.669788165 +0000 UTC m=+1544.872283303" lastFinishedPulling="2025-11-25 09:30:12.17973121 +0000 UTC m=+1545.382226348" observedRunningTime="2025-11-25 09:30:12.823633724 +0000 UTC m=+1546.026128862" watchObservedRunningTime="2025-11-25 09:30:12.829266934 +0000 UTC m=+1546.031762072"
Nov 25 09:30:16 crc kubenswrapper[4565]: I1125 09:30:16.097394 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d"
Nov 25 09:30:16 crc kubenswrapper[4565]: E1125 09:30:16.098067 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d"
Nov 25 09:30:18 crc kubenswrapper[4565]: I1125 09:30:18.868780 4565 generic.go:334] "Generic (PLEG): container finished" podID="a799faaf-4d07-46a0-8b5b-ffa7af279ab7" containerID="43c4ff9bf5adca0cb9b01593e8a485280d1f9c8e33dd39486e7624eba6119c25" exitCode=0
Nov 25 09:30:18 crc kubenswrapper[4565]: I1125 09:30:18.868885 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpc87" event={"ID":"a799faaf-4d07-46a0-8b5b-ffa7af279ab7","Type":"ContainerDied","Data":"43c4ff9bf5adca0cb9b01593e8a485280d1f9c8e33dd39486e7624eba6119c25"}
Nov 25 09:30:20 crc kubenswrapper[4565]: I1125 09:30:20.190464 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpc87"
Nov 25 09:30:20 crc kubenswrapper[4565]: I1125 09:30:20.331010 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrp5p\" (UniqueName: \"kubernetes.io/projected/a799faaf-4d07-46a0-8b5b-ffa7af279ab7-kube-api-access-rrp5p\") pod \"a799faaf-4d07-46a0-8b5b-ffa7af279ab7\" (UID: \"a799faaf-4d07-46a0-8b5b-ffa7af279ab7\") "
Nov 25 09:30:20 crc kubenswrapper[4565]: I1125 09:30:20.331404 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a799faaf-4d07-46a0-8b5b-ffa7af279ab7-ssh-key\") pod \"a799faaf-4d07-46a0-8b5b-ffa7af279ab7\" (UID: \"a799faaf-4d07-46a0-8b5b-ffa7af279ab7\") "
Nov 25 09:30:20 crc kubenswrapper[4565]: I1125 09:30:20.331451 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a799faaf-4d07-46a0-8b5b-ffa7af279ab7-inventory\") pod \"a799faaf-4d07-46a0-8b5b-ffa7af279ab7\" (UID: \"a799faaf-4d07-46a0-8b5b-ffa7af279ab7\") "
Nov 25 09:30:20 crc kubenswrapper[4565]: I1125 09:30:20.337582 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a799faaf-4d07-46a0-8b5b-ffa7af279ab7-kube-api-access-rrp5p" (OuterVolumeSpecName: "kube-api-access-rrp5p") pod "a799faaf-4d07-46a0-8b5b-ffa7af279ab7" (UID: "a799faaf-4d07-46a0-8b5b-ffa7af279ab7"). InnerVolumeSpecName "kube-api-access-rrp5p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:30:20 crc kubenswrapper[4565]: I1125 09:30:20.355568 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a799faaf-4d07-46a0-8b5b-ffa7af279ab7-inventory" (OuterVolumeSpecName: "inventory") pod "a799faaf-4d07-46a0-8b5b-ffa7af279ab7" (UID: "a799faaf-4d07-46a0-8b5b-ffa7af279ab7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:30:20 crc kubenswrapper[4565]: I1125 09:30:20.357986 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a799faaf-4d07-46a0-8b5b-ffa7af279ab7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a799faaf-4d07-46a0-8b5b-ffa7af279ab7" (UID: "a799faaf-4d07-46a0-8b5b-ffa7af279ab7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:30:20 crc kubenswrapper[4565]: I1125 09:30:20.438596 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrp5p\" (UniqueName: \"kubernetes.io/projected/a799faaf-4d07-46a0-8b5b-ffa7af279ab7-kube-api-access-rrp5p\") on node \"crc\" DevicePath \"\""
Nov 25 09:30:20 crc kubenswrapper[4565]: I1125 09:30:20.438738 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a799faaf-4d07-46a0-8b5b-ffa7af279ab7-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 25 09:30:20 crc kubenswrapper[4565]: I1125 09:30:20.438798 4565 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a799faaf-4d07-46a0-8b5b-ffa7af279ab7-inventory\") on node \"crc\" DevicePath \"\""
Nov 25 09:30:20 crc kubenswrapper[4565]: I1125 09:30:20.909961 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpc87" event={"ID":"a799faaf-4d07-46a0-8b5b-ffa7af279ab7","Type":"ContainerDied","Data":"74f49fdb811d6727e6af3df94ca5a4a3267e2b30641de60f118825d933d94f10"}
Nov 25 09:30:20 crc kubenswrapper[4565]: I1125 09:30:20.910002 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74f49fdb811d6727e6af3df94ca5a4a3267e2b30641de60f118825d933d94f10"
Nov 25 09:30:20 crc kubenswrapper[4565]: I1125 09:30:20.910056 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpc87"
Nov 25 09:30:20 crc kubenswrapper[4565]: I1125 09:30:20.946776 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9"]
Nov 25 09:30:20 crc kubenswrapper[4565]: E1125 09:30:20.947317 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a799faaf-4d07-46a0-8b5b-ffa7af279ab7" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Nov 25 09:30:20 crc kubenswrapper[4565]: I1125 09:30:20.947386 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a799faaf-4d07-46a0-8b5b-ffa7af279ab7" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Nov 25 09:30:20 crc kubenswrapper[4565]: I1125 09:30:20.947630 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="a799faaf-4d07-46a0-8b5b-ffa7af279ab7" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Nov 25 09:30:20 crc kubenswrapper[4565]: I1125 09:30:20.948400 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9"
Nov 25 09:30:20 crc kubenswrapper[4565]: I1125 09:30:20.952280 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 25 09:30:20 crc kubenswrapper[4565]: I1125 09:30:20.952522 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 25 09:30:20 crc kubenswrapper[4565]: I1125 09:30:20.953393 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc"
Nov 25 09:30:20 crc kubenswrapper[4565]: I1125 09:30:20.953603 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 25 09:30:20 crc kubenswrapper[4565]: I1125 09:30:20.956784 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9"]
Nov 25 09:30:21 crc kubenswrapper[4565]: I1125 09:30:21.053803 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68414e6d-0dc0-43ee-9273-6b4fc3e43563-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9\" (UID: \"68414e6d-0dc0-43ee-9273-6b4fc3e43563\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9"
Nov 25 09:30:21 crc kubenswrapper[4565]: I1125 09:30:21.054006 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/68414e6d-0dc0-43ee-9273-6b4fc3e43563-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9\" (UID: \"68414e6d-0dc0-43ee-9273-6b4fc3e43563\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9"
Nov 25 09:30:21 crc kubenswrapper[4565]: I1125 09:30:21.054201 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm8g5\" (UniqueName: \"kubernetes.io/projected/68414e6d-0dc0-43ee-9273-6b4fc3e43563-kube-api-access-vm8g5\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9\" (UID: \"68414e6d-0dc0-43ee-9273-6b4fc3e43563\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9"
Nov 25 09:30:21 crc kubenswrapper[4565]: I1125 09:30:21.156522 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68414e6d-0dc0-43ee-9273-6b4fc3e43563-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9\" (UID: \"68414e6d-0dc0-43ee-9273-6b4fc3e43563\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9"
Nov 25 09:30:21 crc kubenswrapper[4565]: I1125 09:30:21.156588 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/68414e6d-0dc0-43ee-9273-6b4fc3e43563-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9\" (UID: \"68414e6d-0dc0-43ee-9273-6b4fc3e43563\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9"
Nov 25 09:30:21 crc kubenswrapper[4565]: I1125 09:30:21.158050 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm8g5\" (UniqueName: \"kubernetes.io/projected/68414e6d-0dc0-43ee-9273-6b4fc3e43563-kube-api-access-vm8g5\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9\" (UID: \"68414e6d-0dc0-43ee-9273-6b4fc3e43563\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9"
Nov 25 09:30:21 crc kubenswrapper[4565]: I1125 09:30:21.165781 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/68414e6d-0dc0-43ee-9273-6b4fc3e43563-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9\" (UID: \"68414e6d-0dc0-43ee-9273-6b4fc3e43563\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9"
Nov 25 09:30:21 crc kubenswrapper[4565]: I1125 09:30:21.172769 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68414e6d-0dc0-43ee-9273-6b4fc3e43563-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9\" (UID: \"68414e6d-0dc0-43ee-9273-6b4fc3e43563\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9"
Nov 25 09:30:21 crc kubenswrapper[4565]: I1125 09:30:21.175939 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm8g5\" (UniqueName: \"kubernetes.io/projected/68414e6d-0dc0-43ee-9273-6b4fc3e43563-kube-api-access-vm8g5\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9\" (UID: \"68414e6d-0dc0-43ee-9273-6b4fc3e43563\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9"
Nov 25 09:30:21 crc kubenswrapper[4565]: I1125 09:30:21.268367 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9"
Nov 25 09:30:21 crc kubenswrapper[4565]: I1125 09:30:21.751348 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9"]
Nov 25 09:30:21 crc kubenswrapper[4565]: I1125 09:30:21.921668 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9" event={"ID":"68414e6d-0dc0-43ee-9273-6b4fc3e43563","Type":"ContainerStarted","Data":"7fe8375df385f5ff9791cde078693a5ca2465734d5779b7bce51002dbe8b7ca5"}
Nov 25 09:30:22 crc kubenswrapper[4565]: I1125 09:30:22.937485 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9" event={"ID":"68414e6d-0dc0-43ee-9273-6b4fc3e43563","Type":"ContainerStarted","Data":"088dea403f5a74872d48e5d0262b778d3613d565de11bf54a5a78afe49847ed2"}
Nov 25 09:30:22 crc kubenswrapper[4565]: I1125 09:30:22.968322 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9" podStartSLOduration=2.4706829790000002 podStartE2EDuration="2.968292101s" podCreationTimestamp="2025-11-25 09:30:20 +0000 UTC" firstStartedPulling="2025-11-25 09:30:21.754025152 +0000 UTC m=+1554.956520290" lastFinishedPulling="2025-11-25 09:30:22.251634274 +0000 UTC m=+1555.454129412" observedRunningTime="2025-11-25 09:30:22.956258083 +0000 UTC m=+1556.158753221" watchObservedRunningTime="2025-11-25 09:30:22.968292101 +0000 UTC m=+1556.170787239"
Nov 25 09:30:23 crc kubenswrapper[4565]: I1125 09:30:23.037173 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2601-account-create-pwb9l"]
Nov 25 09:30:23 crc kubenswrapper[4565]: I1125 09:30:23.068287 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-6qlsb"]
Nov 25 09:30:23 crc kubenswrapper[4565]: I1125 09:30:23.087001 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-4c0b-account-create-dsld7"]
Nov 25 09:30:23 crc kubenswrapper[4565]: I1125 09:30:23.096327 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-8hwpf"]
Nov 25 09:30:23 crc kubenswrapper[4565]: I1125 09:30:23.111882 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-zbsrv"]
Nov 25 09:30:23 crc kubenswrapper[4565]: I1125 09:30:23.118083 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-6qlsb"]
Nov 25 09:30:23 crc kubenswrapper[4565]: I1125 09:30:23.123889 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-d911-account-create-sd45h"]
Nov 25 09:30:23 crc kubenswrapper[4565]: I1125 09:30:23.129895 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-2601-account-create-pwb9l"]
Nov 25 09:30:23 crc kubenswrapper[4565]: I1125 09:30:23.135032 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-zbsrv"]
Nov 25 09:30:23 crc kubenswrapper[4565]: I1125 09:30:23.141393 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-d911-account-create-sd45h"]
Nov 25 09:30:23 crc kubenswrapper[4565]: I1125 09:30:23.145604 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-8hwpf"]
Nov 25 09:30:23 crc kubenswrapper[4565]: I1125 09:30:23.150720 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-4c0b-account-create-dsld7"]
Nov 25 09:30:25 crc kubenswrapper[4565]: I1125 09:30:25.108630 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645" path="/var/lib/kubelet/pods/28aa4f32-5ea5-4e25-aa7d-23b5d4ea2645/volumes"
Nov 25 09:30:25 crc kubenswrapper[4565]: I1125 09:30:25.109570 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39af6b73-ead9-4001-b7bb-990b384efbd6" path="/var/lib/kubelet/pods/39af6b73-ead9-4001-b7bb-990b384efbd6/volumes"
Nov 25 09:30:25 crc kubenswrapper[4565]: I1125 09:30:25.110169 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55f7705c-306a-4c2f-b4c1-3b1617c83568" path="/var/lib/kubelet/pods/55f7705c-306a-4c2f-b4c1-3b1617c83568/volumes"
Nov 25 09:30:25 crc kubenswrapper[4565]: I1125 09:30:25.110741 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="910d79e3-ae87-4083-ab49-d472d838cca5" path="/var/lib/kubelet/pods/910d79e3-ae87-4083-ab49-d472d838cca5/volumes"
Nov 25 09:30:25 crc kubenswrapper[4565]: I1125 09:30:25.111822 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c562a5bf-0aaf-4c6b-bef5-c3de522e3382" path="/var/lib/kubelet/pods/c562a5bf-0aaf-4c6b-bef5-c3de522e3382/volumes"
Nov 25 09:30:25 crc kubenswrapper[4565]: I1125 09:30:25.112369 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f03e89dd-5750-496e-8fed-117f36a6649b" path="/var/lib/kubelet/pods/f03e89dd-5750-496e-8fed-117f36a6649b/volumes"
Nov 25 09:30:27 crc kubenswrapper[4565]: I1125 09:30:27.114520 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d"
Nov 25 09:30:27 crc kubenswrapper[4565]: E1125 09:30:27.115497 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d"
Nov 25 09:30:30 crc kubenswrapper[4565]: I1125 09:30:30.013031 4565 generic.go:334] "Generic (PLEG): container finished" podID="68414e6d-0dc0-43ee-9273-6b4fc3e43563" containerID="088dea403f5a74872d48e5d0262b778d3613d565de11bf54a5a78afe49847ed2" exitCode=0
Nov 25 09:30:30 crc kubenswrapper[4565]: I1125 09:30:30.013101 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9" event={"ID":"68414e6d-0dc0-43ee-9273-6b4fc3e43563","Type":"ContainerDied","Data":"088dea403f5a74872d48e5d0262b778d3613d565de11bf54a5a78afe49847ed2"}
Nov 25 09:30:31 crc kubenswrapper[4565]: I1125 09:30:31.392211 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9"
Nov 25 09:30:31 crc kubenswrapper[4565]: I1125 09:30:31.581088 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm8g5\" (UniqueName: \"kubernetes.io/projected/68414e6d-0dc0-43ee-9273-6b4fc3e43563-kube-api-access-vm8g5\") pod \"68414e6d-0dc0-43ee-9273-6b4fc3e43563\" (UID: \"68414e6d-0dc0-43ee-9273-6b4fc3e43563\") "
Nov 25 09:30:31 crc kubenswrapper[4565]: I1125 09:30:31.581496 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68414e6d-0dc0-43ee-9273-6b4fc3e43563-inventory\") pod \"68414e6d-0dc0-43ee-9273-6b4fc3e43563\" (UID: \"68414e6d-0dc0-43ee-9273-6b4fc3e43563\") "
Nov 25 09:30:31 crc kubenswrapper[4565]: I1125 09:30:31.581649 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/68414e6d-0dc0-43ee-9273-6b4fc3e43563-ssh-key\") pod \"68414e6d-0dc0-43ee-9273-6b4fc3e43563\" (UID: \"68414e6d-0dc0-43ee-9273-6b4fc3e43563\") "
Nov 25 09:30:31 crc kubenswrapper[4565]: I1125 09:30:31.587362 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68414e6d-0dc0-43ee-9273-6b4fc3e43563-kube-api-access-vm8g5" (OuterVolumeSpecName: "kube-api-access-vm8g5") pod "68414e6d-0dc0-43ee-9273-6b4fc3e43563" (UID: "68414e6d-0dc0-43ee-9273-6b4fc3e43563"). InnerVolumeSpecName "kube-api-access-vm8g5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:30:31 crc kubenswrapper[4565]: I1125 09:30:31.604312 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68414e6d-0dc0-43ee-9273-6b4fc3e43563-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "68414e6d-0dc0-43ee-9273-6b4fc3e43563" (UID: "68414e6d-0dc0-43ee-9273-6b4fc3e43563"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:30:31 crc kubenswrapper[4565]: I1125 09:30:31.606440 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68414e6d-0dc0-43ee-9273-6b4fc3e43563-inventory" (OuterVolumeSpecName: "inventory") pod "68414e6d-0dc0-43ee-9273-6b4fc3e43563" (UID: "68414e6d-0dc0-43ee-9273-6b4fc3e43563"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:30:31 crc kubenswrapper[4565]: I1125 09:30:31.685489 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/68414e6d-0dc0-43ee-9273-6b4fc3e43563-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 25 09:30:31 crc kubenswrapper[4565]: I1125 09:30:31.685535 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm8g5\" (UniqueName: \"kubernetes.io/projected/68414e6d-0dc0-43ee-9273-6b4fc3e43563-kube-api-access-vm8g5\") on node \"crc\" DevicePath \"\""
Nov 25 09:30:31 crc kubenswrapper[4565]: I1125 09:30:31.685568 4565 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68414e6d-0dc0-43ee-9273-6b4fc3e43563-inventory\") on node \"crc\" DevicePath \"\""
Nov 25 09:30:32 crc kubenswrapper[4565]: I1125 09:30:32.035760 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9" event={"ID":"68414e6d-0dc0-43ee-9273-6b4fc3e43563","Type":"ContainerDied","Data":"7fe8375df385f5ff9791cde078693a5ca2465734d5779b7bce51002dbe8b7ca5"}
Nov 25 09:30:32 crc kubenswrapper[4565]: I1125 09:30:32.036041 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fe8375df385f5ff9791cde078693a5ca2465734d5779b7bce51002dbe8b7ca5"
Nov 25 09:30:32 crc kubenswrapper[4565]: I1125 09:30:32.035880 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9"
Nov 25 09:30:35 crc kubenswrapper[4565]: I1125 09:30:35.270189 4565 scope.go:117] "RemoveContainer" containerID="f73fd81c11bc0a963f9638866fd24ace6521bf7bd2e5d35ec6c05f11252ffdc9"
Nov 25 09:30:35 crc kubenswrapper[4565]: I1125 09:30:35.300020 4565 scope.go:117] "RemoveContainer" containerID="17b647ac503a84a0e4a2f50f2bae9096d71806f0f90aa8b93a931bdc0994d1a6"
Nov 25 09:30:35 crc kubenswrapper[4565]: I1125 09:30:35.354475 4565 scope.go:117] "RemoveContainer" containerID="64309c9963538d1e0075da47637e79925e8504cd9622ca3e4ef4ad0cfa657368"
Nov 25 09:30:35 crc kubenswrapper[4565]: I1125 09:30:35.394212 4565 scope.go:117] "RemoveContainer" containerID="ecb74bbb60f1cbd0cbbeb226b021ff41c8730bf7f9a589902f92c49fe9f8872e"
Nov 25 09:30:35 crc kubenswrapper[4565]: I1125 09:30:35.448630 4565 scope.go:117] "RemoveContainer" containerID="4c76c7ef4a0312ed1b7c32f9b5033d648eb62141f925e01fbc2b28f9a9473c3c"
Nov 25 09:30:35 crc kubenswrapper[4565]: I1125 09:30:35.477976 4565 scope.go:117] "RemoveContainer" containerID="20a67b7c8db921e0c289e276c1d116835d692e5c5a8ad314169969c24a268e42"
Nov 25 09:30:35 crc kubenswrapper[4565]: I1125 09:30:35.504573 4565 scope.go:117] "RemoveContainer" containerID="64e136ed8f8c52e376ca5a0711f49bdc3e9d6f0c5911ed93b77c87ec1060650d"
Nov 25 09:30:40 crc kubenswrapper[4565]: I1125 09:30:40.098896 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d"
Nov 25 09:30:40 crc kubenswrapper[4565]: E1125 09:30:40.099442 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d"
Nov 25 09:30:43 crc kubenswrapper[4565]: I1125 09:30:43.040590 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-l7jv6"]
Nov 25 09:30:43 crc kubenswrapper[4565]: I1125 09:30:43.052386 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-l7jv6"]
Nov 25 09:30:43 crc kubenswrapper[4565]: I1125 09:30:43.106409 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84c75e09-f927-412d-9a45-b122c13e711d" path="/var/lib/kubelet/pods/84c75e09-f927-412d-9a45-b122c13e711d/volumes"
Nov 25 09:30:51 crc kubenswrapper[4565]: I1125 09:30:51.098149 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d"
Nov 25 09:30:51 crc kubenswrapper[4565]: E1125 09:30:51.099332 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d"
Nov 25 09:31:02 crc kubenswrapper[4565]: I1125 09:31:02.097673 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d"
Nov 25 09:31:02 crc kubenswrapper[4565]: E1125 09:31:02.098315 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d"
Nov 25 09:31:03 crc kubenswrapper[4565]: I1125 09:31:03.064434 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xcthc"]
Nov 25 09:31:03 crc kubenswrapper[4565]: I1125 09:31:03.065649 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xcthc"]
Nov 25 09:31:03 crc kubenswrapper[4565]: I1125 09:31:03.107735 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf895328-1ce8-477a-8939-2fe0442bfdb9" path="/var/lib/kubelet/pods/cf895328-1ce8-477a-8939-2fe0442bfdb9/volumes"
Nov 25 09:31:05 crc kubenswrapper[4565]: I1125 09:31:05.029213 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-x72jl"]
Nov 25 09:31:05 crc kubenswrapper[4565]: I1125 09:31:05.036482 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-x72jl"]
Nov 25 09:31:05 crc kubenswrapper[4565]: I1125 09:31:05.106169 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1097aa33-9ec1-4839-b8ea-faa176627408" path="/var/lib/kubelet/pods/1097aa33-9ec1-4839-b8ea-faa176627408/volumes"
Nov 25 09:31:14 crc kubenswrapper[4565]: I1125 09:31:14.100999 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-whph2"]
Nov 25 09:31:14 crc kubenswrapper[4565]: E1125 09:31:14.102301 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68414e6d-0dc0-43ee-9273-6b4fc3e43563" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Nov 25 09:31:14 crc kubenswrapper[4565]: I1125 09:31:14.102330 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="68414e6d-0dc0-43ee-9273-6b4fc3e43563" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Nov 25 09:31:14 crc kubenswrapper[4565]: I1125 09:31:14.102576 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="68414e6d-0dc0-43ee-9273-6b4fc3e43563" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Nov 25 09:31:14 crc kubenswrapper[4565]: I1125 09:31:14.104261 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-whph2"
Nov 25 09:31:14 crc kubenswrapper[4565]: I1125 09:31:14.118304 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-whph2"]
Nov 25 09:31:14 crc kubenswrapper[4565]: I1125 09:31:14.282244 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7fa1c0-445d-4119-a37c-fd5192d8466b-utilities\") pod \"certified-operators-whph2\" (UID: \"1f7fa1c0-445d-4119-a37c-fd5192d8466b\") " pod="openshift-marketplace/certified-operators-whph2"
Nov 25 09:31:14 crc kubenswrapper[4565]: I1125 09:31:14.282359 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9rsl\" (UniqueName: \"kubernetes.io/projected/1f7fa1c0-445d-4119-a37c-fd5192d8466b-kube-api-access-p9rsl\") pod \"certified-operators-whph2\" (UID: \"1f7fa1c0-445d-4119-a37c-fd5192d8466b\") " pod="openshift-marketplace/certified-operators-whph2"
Nov 25 09:31:14 crc kubenswrapper[4565]: I1125 09:31:14.282428 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7fa1c0-445d-4119-a37c-fd5192d8466b-catalog-content\") pod \"certified-operators-whph2\" (UID: \"1f7fa1c0-445d-4119-a37c-fd5192d8466b\") " pod="openshift-marketplace/certified-operators-whph2"
Nov 25 09:31:14 crc kubenswrapper[4565]: I1125 09:31:14.385022 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7fa1c0-445d-4119-a37c-fd5192d8466b-utilities\") pod \"certified-operators-whph2\" (UID: \"1f7fa1c0-445d-4119-a37c-fd5192d8466b\") " pod="openshift-marketplace/certified-operators-whph2"
Nov 25 09:31:14 crc kubenswrapper[4565]: I1125 09:31:14.385121 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9rsl\" (UniqueName: \"kubernetes.io/projected/1f7fa1c0-445d-4119-a37c-fd5192d8466b-kube-api-access-p9rsl\") pod \"certified-operators-whph2\" (UID: \"1f7fa1c0-445d-4119-a37c-fd5192d8466b\") " pod="openshift-marketplace/certified-operators-whph2"
Nov 25 09:31:14 crc kubenswrapper[4565]: I1125 09:31:14.385540 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7fa1c0-445d-4119-a37c-fd5192d8466b-utilities\") pod \"certified-operators-whph2\" (UID: \"1f7fa1c0-445d-4119-a37c-fd5192d8466b\") " pod="openshift-marketplace/certified-operators-whph2"
Nov 25 09:31:14 crc kubenswrapper[4565]: I1125 09:31:14.385592 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7fa1c0-445d-4119-a37c-fd5192d8466b-catalog-content\") pod \"certified-operators-whph2\" (UID: \"1f7fa1c0-445d-4119-a37c-fd5192d8466b\") " pod="openshift-marketplace/certified-operators-whph2"
Nov 25 09:31:14 crc kubenswrapper[4565]: I1125 09:31:14.385959 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7fa1c0-445d-4119-a37c-fd5192d8466b-catalog-content\") pod \"certified-operators-whph2\" (UID: \"1f7fa1c0-445d-4119-a37c-fd5192d8466b\") " pod="openshift-marketplace/certified-operators-whph2"
Nov 25 09:31:14 crc kubenswrapper[4565]: I1125 09:31:14.404636 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9rsl\" (UniqueName: \"kubernetes.io/projected/1f7fa1c0-445d-4119-a37c-fd5192d8466b-kube-api-access-p9rsl\") pod \"certified-operators-whph2\" (UID: \"1f7fa1c0-445d-4119-a37c-fd5192d8466b\") " pod="openshift-marketplace/certified-operators-whph2"
Nov 25 09:31:14 crc kubenswrapper[4565]: I1125 09:31:14.428422 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-whph2"
Nov 25 09:31:14 crc kubenswrapper[4565]: I1125 09:31:14.850746 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-whph2"]
Nov 25 09:31:15 crc kubenswrapper[4565]: I1125 09:31:15.097600 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d"
Nov 25 09:31:15 crc kubenswrapper[4565]: E1125 09:31:15.098054 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d"
Nov 25 09:31:15 crc kubenswrapper[4565]: I1125 09:31:15.463272 4565 generic.go:334] "Generic (PLEG): container finished" podID="1f7fa1c0-445d-4119-a37c-fd5192d8466b" containerID="aeb05436f7ade439b114df33e5efcade052e5fb6fcf092878678f9f416f7dcda" exitCode=0
Nov 25 09:31:15 crc kubenswrapper[4565]: I1125 09:31:15.463338 4565 kubelet.go:2453]
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whph2" event={"ID":"1f7fa1c0-445d-4119-a37c-fd5192d8466b","Type":"ContainerDied","Data":"aeb05436f7ade439b114df33e5efcade052e5fb6fcf092878678f9f416f7dcda"} Nov 25 09:31:15 crc kubenswrapper[4565]: I1125 09:31:15.463378 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whph2" event={"ID":"1f7fa1c0-445d-4119-a37c-fd5192d8466b","Type":"ContainerStarted","Data":"8586ddc1d438bbd40b586c9f98b8d0579924b1578a3cf3b3a3b1685867ed940a"} Nov 25 09:31:16 crc kubenswrapper[4565]: I1125 09:31:16.475382 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whph2" event={"ID":"1f7fa1c0-445d-4119-a37c-fd5192d8466b","Type":"ContainerStarted","Data":"bf75a132216cdff3c9b5a103cde01455df57557c16e7931439e5a3429c4d76ca"} Nov 25 09:31:17 crc kubenswrapper[4565]: I1125 09:31:17.486523 4565 generic.go:334] "Generic (PLEG): container finished" podID="1f7fa1c0-445d-4119-a37c-fd5192d8466b" containerID="bf75a132216cdff3c9b5a103cde01455df57557c16e7931439e5a3429c4d76ca" exitCode=0 Nov 25 09:31:17 crc kubenswrapper[4565]: I1125 09:31:17.486870 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whph2" event={"ID":"1f7fa1c0-445d-4119-a37c-fd5192d8466b","Type":"ContainerDied","Data":"bf75a132216cdff3c9b5a103cde01455df57557c16e7931439e5a3429c4d76ca"} Nov 25 09:31:18 crc kubenswrapper[4565]: I1125 09:31:18.502069 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whph2" event={"ID":"1f7fa1c0-445d-4119-a37c-fd5192d8466b","Type":"ContainerStarted","Data":"3f33ce9c02751b91733d0af62139827d4a3a63a9584032e34d7ef24e1be34fb3"} Nov 25 09:31:24 crc kubenswrapper[4565]: I1125 09:31:24.430111 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-whph2" Nov 25 09:31:24 crc 
kubenswrapper[4565]: I1125 09:31:24.430789 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-whph2" Nov 25 09:31:24 crc kubenswrapper[4565]: I1125 09:31:24.469048 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-whph2" Nov 25 09:31:24 crc kubenswrapper[4565]: I1125 09:31:24.492191 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-whph2" podStartSLOduration=7.98069627 podStartE2EDuration="10.492165392s" podCreationTimestamp="2025-11-25 09:31:14 +0000 UTC" firstStartedPulling="2025-11-25 09:31:15.46518611 +0000 UTC m=+1608.667681248" lastFinishedPulling="2025-11-25 09:31:17.976655233 +0000 UTC m=+1611.179150370" observedRunningTime="2025-11-25 09:31:18.520443083 +0000 UTC m=+1611.722938222" watchObservedRunningTime="2025-11-25 09:31:24.492165392 +0000 UTC m=+1617.694660530" Nov 25 09:31:24 crc kubenswrapper[4565]: I1125 09:31:24.596137 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-whph2" Nov 25 09:31:24 crc kubenswrapper[4565]: I1125 09:31:24.710229 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-whph2"] Nov 25 09:31:26 crc kubenswrapper[4565]: I1125 09:31:26.579010 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-whph2" podUID="1f7fa1c0-445d-4119-a37c-fd5192d8466b" containerName="registry-server" containerID="cri-o://3f33ce9c02751b91733d0af62139827d4a3a63a9584032e34d7ef24e1be34fb3" gracePeriod=2 Nov 25 09:31:26 crc kubenswrapper[4565]: I1125 09:31:26.926342 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-whph2" Nov 25 09:31:26 crc kubenswrapper[4565]: I1125 09:31:26.970777 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7fa1c0-445d-4119-a37c-fd5192d8466b-utilities\") pod \"1f7fa1c0-445d-4119-a37c-fd5192d8466b\" (UID: \"1f7fa1c0-445d-4119-a37c-fd5192d8466b\") " Nov 25 09:31:26 crc kubenswrapper[4565]: I1125 09:31:26.970869 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7fa1c0-445d-4119-a37c-fd5192d8466b-catalog-content\") pod \"1f7fa1c0-445d-4119-a37c-fd5192d8466b\" (UID: \"1f7fa1c0-445d-4119-a37c-fd5192d8466b\") " Nov 25 09:31:26 crc kubenswrapper[4565]: I1125 09:31:26.971039 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9rsl\" (UniqueName: \"kubernetes.io/projected/1f7fa1c0-445d-4119-a37c-fd5192d8466b-kube-api-access-p9rsl\") pod \"1f7fa1c0-445d-4119-a37c-fd5192d8466b\" (UID: \"1f7fa1c0-445d-4119-a37c-fd5192d8466b\") " Nov 25 09:31:26 crc kubenswrapper[4565]: I1125 09:31:26.976776 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f7fa1c0-445d-4119-a37c-fd5192d8466b-utilities" (OuterVolumeSpecName: "utilities") pod "1f7fa1c0-445d-4119-a37c-fd5192d8466b" (UID: "1f7fa1c0-445d-4119-a37c-fd5192d8466b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:31:26 crc kubenswrapper[4565]: I1125 09:31:26.986206 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f7fa1c0-445d-4119-a37c-fd5192d8466b-kube-api-access-p9rsl" (OuterVolumeSpecName: "kube-api-access-p9rsl") pod "1f7fa1c0-445d-4119-a37c-fd5192d8466b" (UID: "1f7fa1c0-445d-4119-a37c-fd5192d8466b"). InnerVolumeSpecName "kube-api-access-p9rsl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:31:27 crc kubenswrapper[4565]: I1125 09:31:27.009943 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f7fa1c0-445d-4119-a37c-fd5192d8466b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f7fa1c0-445d-4119-a37c-fd5192d8466b" (UID: "1f7fa1c0-445d-4119-a37c-fd5192d8466b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:31:27 crc kubenswrapper[4565]: I1125 09:31:27.072814 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7fa1c0-445d-4119-a37c-fd5192d8466b-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:31:27 crc kubenswrapper[4565]: I1125 09:31:27.072844 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7fa1c0-445d-4119-a37c-fd5192d8466b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:31:27 crc kubenswrapper[4565]: I1125 09:31:27.072865 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9rsl\" (UniqueName: \"kubernetes.io/projected/1f7fa1c0-445d-4119-a37c-fd5192d8466b-kube-api-access-p9rsl\") on node \"crc\" DevicePath \"\"" Nov 25 09:31:27 crc kubenswrapper[4565]: I1125 09:31:27.589552 4565 generic.go:334] "Generic (PLEG): container finished" podID="1f7fa1c0-445d-4119-a37c-fd5192d8466b" containerID="3f33ce9c02751b91733d0af62139827d4a3a63a9584032e34d7ef24e1be34fb3" exitCode=0 Nov 25 09:31:27 crc kubenswrapper[4565]: I1125 09:31:27.589597 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whph2" event={"ID":"1f7fa1c0-445d-4119-a37c-fd5192d8466b","Type":"ContainerDied","Data":"3f33ce9c02751b91733d0af62139827d4a3a63a9584032e34d7ef24e1be34fb3"} Nov 25 09:31:27 crc kubenswrapper[4565]: I1125 09:31:27.589871 4565 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-whph2" event={"ID":"1f7fa1c0-445d-4119-a37c-fd5192d8466b","Type":"ContainerDied","Data":"8586ddc1d438bbd40b586c9f98b8d0579924b1578a3cf3b3a3b1685867ed940a"} Nov 25 09:31:27 crc kubenswrapper[4565]: I1125 09:31:27.589896 4565 scope.go:117] "RemoveContainer" containerID="3f33ce9c02751b91733d0af62139827d4a3a63a9584032e34d7ef24e1be34fb3" Nov 25 09:31:27 crc kubenswrapper[4565]: I1125 09:31:27.589680 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-whph2" Nov 25 09:31:27 crc kubenswrapper[4565]: I1125 09:31:27.615707 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-whph2"] Nov 25 09:31:27 crc kubenswrapper[4565]: I1125 09:31:27.622404 4565 scope.go:117] "RemoveContainer" containerID="bf75a132216cdff3c9b5a103cde01455df57557c16e7931439e5a3429c4d76ca" Nov 25 09:31:27 crc kubenswrapper[4565]: I1125 09:31:27.623282 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-whph2"] Nov 25 09:31:27 crc kubenswrapper[4565]: I1125 09:31:27.646792 4565 scope.go:117] "RemoveContainer" containerID="aeb05436f7ade439b114df33e5efcade052e5fb6fcf092878678f9f416f7dcda" Nov 25 09:31:27 crc kubenswrapper[4565]: I1125 09:31:27.671178 4565 scope.go:117] "RemoveContainer" containerID="3f33ce9c02751b91733d0af62139827d4a3a63a9584032e34d7ef24e1be34fb3" Nov 25 09:31:27 crc kubenswrapper[4565]: E1125 09:31:27.671674 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f33ce9c02751b91733d0af62139827d4a3a63a9584032e34d7ef24e1be34fb3\": container with ID starting with 3f33ce9c02751b91733d0af62139827d4a3a63a9584032e34d7ef24e1be34fb3 not found: ID does not exist" containerID="3f33ce9c02751b91733d0af62139827d4a3a63a9584032e34d7ef24e1be34fb3" Nov 25 09:31:27 crc kubenswrapper[4565]: I1125 
09:31:27.671711 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f33ce9c02751b91733d0af62139827d4a3a63a9584032e34d7ef24e1be34fb3"} err="failed to get container status \"3f33ce9c02751b91733d0af62139827d4a3a63a9584032e34d7ef24e1be34fb3\": rpc error: code = NotFound desc = could not find container \"3f33ce9c02751b91733d0af62139827d4a3a63a9584032e34d7ef24e1be34fb3\": container with ID starting with 3f33ce9c02751b91733d0af62139827d4a3a63a9584032e34d7ef24e1be34fb3 not found: ID does not exist" Nov 25 09:31:27 crc kubenswrapper[4565]: I1125 09:31:27.671732 4565 scope.go:117] "RemoveContainer" containerID="bf75a132216cdff3c9b5a103cde01455df57557c16e7931439e5a3429c4d76ca" Nov 25 09:31:27 crc kubenswrapper[4565]: E1125 09:31:27.672031 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf75a132216cdff3c9b5a103cde01455df57557c16e7931439e5a3429c4d76ca\": container with ID starting with bf75a132216cdff3c9b5a103cde01455df57557c16e7931439e5a3429c4d76ca not found: ID does not exist" containerID="bf75a132216cdff3c9b5a103cde01455df57557c16e7931439e5a3429c4d76ca" Nov 25 09:31:27 crc kubenswrapper[4565]: I1125 09:31:27.672125 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf75a132216cdff3c9b5a103cde01455df57557c16e7931439e5a3429c4d76ca"} err="failed to get container status \"bf75a132216cdff3c9b5a103cde01455df57557c16e7931439e5a3429c4d76ca\": rpc error: code = NotFound desc = could not find container \"bf75a132216cdff3c9b5a103cde01455df57557c16e7931439e5a3429c4d76ca\": container with ID starting with bf75a132216cdff3c9b5a103cde01455df57557c16e7931439e5a3429c4d76ca not found: ID does not exist" Nov 25 09:31:27 crc kubenswrapper[4565]: I1125 09:31:27.672211 4565 scope.go:117] "RemoveContainer" containerID="aeb05436f7ade439b114df33e5efcade052e5fb6fcf092878678f9f416f7dcda" Nov 25 09:31:27 crc 
kubenswrapper[4565]: E1125 09:31:27.672647 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeb05436f7ade439b114df33e5efcade052e5fb6fcf092878678f9f416f7dcda\": container with ID starting with aeb05436f7ade439b114df33e5efcade052e5fb6fcf092878678f9f416f7dcda not found: ID does not exist" containerID="aeb05436f7ade439b114df33e5efcade052e5fb6fcf092878678f9f416f7dcda" Nov 25 09:31:27 crc kubenswrapper[4565]: I1125 09:31:27.672782 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeb05436f7ade439b114df33e5efcade052e5fb6fcf092878678f9f416f7dcda"} err="failed to get container status \"aeb05436f7ade439b114df33e5efcade052e5fb6fcf092878678f9f416f7dcda\": rpc error: code = NotFound desc = could not find container \"aeb05436f7ade439b114df33e5efcade052e5fb6fcf092878678f9f416f7dcda\": container with ID starting with aeb05436f7ade439b114df33e5efcade052e5fb6fcf092878678f9f416f7dcda not found: ID does not exist" Nov 25 09:31:28 crc kubenswrapper[4565]: I1125 09:31:28.097309 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d" Nov 25 09:31:28 crc kubenswrapper[4565]: E1125 09:31:28.097830 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:31:29 crc kubenswrapper[4565]: I1125 09:31:29.109725 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f7fa1c0-445d-4119-a37c-fd5192d8466b" path="/var/lib/kubelet/pods/1f7fa1c0-445d-4119-a37c-fd5192d8466b/volumes" Nov 25 09:31:34 crc 
kubenswrapper[4565]: E1125 09:31:34.193108 4565 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7fa1c0_445d_4119_a37c_fd5192d8466b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7fa1c0_445d_4119_a37c_fd5192d8466b.slice/crio-8586ddc1d438bbd40b586c9f98b8d0579924b1578a3cf3b3a3b1685867ed940a\": RecentStats: unable to find data in memory cache]" Nov 25 09:31:35 crc kubenswrapper[4565]: I1125 09:31:35.620406 4565 scope.go:117] "RemoveContainer" containerID="f93849e3715173c62cb1defb1877e385c23390def24eda663a153541e78ee958" Nov 25 09:31:35 crc kubenswrapper[4565]: I1125 09:31:35.675117 4565 scope.go:117] "RemoveContainer" containerID="96bd0e1c1f9ef62f5704df271e4e6b63d0f6f070498795bface69ad77120e221" Nov 25 09:31:35 crc kubenswrapper[4565]: I1125 09:31:35.715804 4565 scope.go:117] "RemoveContainer" containerID="7ae64e00d915cf79cc448d39974b79b42a1886dc7a0b39e952f38854e3b715ac" Nov 25 09:31:43 crc kubenswrapper[4565]: I1125 09:31:43.097621 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d" Nov 25 09:31:43 crc kubenswrapper[4565]: E1125 09:31:43.098261 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:31:44 crc kubenswrapper[4565]: E1125 09:31:44.394147 4565 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7fa1c0_445d_4119_a37c_fd5192d8466b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7fa1c0_445d_4119_a37c_fd5192d8466b.slice/crio-8586ddc1d438bbd40b586c9f98b8d0579924b1578a3cf3b3a3b1685867ed940a\": RecentStats: unable to find data in memory cache]" Nov 25 09:31:50 crc kubenswrapper[4565]: I1125 09:31:50.035944 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-khsf2"] Nov 25 09:31:50 crc kubenswrapper[4565]: I1125 09:31:50.041848 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-khsf2"] Nov 25 09:31:51 crc kubenswrapper[4565]: I1125 09:31:51.107460 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c53b2bc5-cd7f-4a87-878c-ca9deec24f8b" path="/var/lib/kubelet/pods/c53b2bc5-cd7f-4a87-878c-ca9deec24f8b/volumes" Nov 25 09:31:54 crc kubenswrapper[4565]: E1125 09:31:54.616231 4565 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7fa1c0_445d_4119_a37c_fd5192d8466b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7fa1c0_445d_4119_a37c_fd5192d8466b.slice/crio-8586ddc1d438bbd40b586c9f98b8d0579924b1578a3cf3b3a3b1685867ed940a\": RecentStats: unable to find data in memory cache]" Nov 25 09:31:58 crc kubenswrapper[4565]: I1125 09:31:58.097647 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d" Nov 25 09:31:58 crc kubenswrapper[4565]: E1125 09:31:58.098656 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:32:04 crc kubenswrapper[4565]: E1125 09:32:04.841146 4565 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7fa1c0_445d_4119_a37c_fd5192d8466b.slice/crio-8586ddc1d438bbd40b586c9f98b8d0579924b1578a3cf3b3a3b1685867ed940a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7fa1c0_445d_4119_a37c_fd5192d8466b.slice\": RecentStats: unable to find data in memory cache]" Nov 25 09:32:12 crc kubenswrapper[4565]: I1125 09:32:12.096811 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d" Nov 25 09:32:12 crc kubenswrapper[4565]: E1125 09:32:12.097385 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:32:13 crc kubenswrapper[4565]: I1125 09:32:13.442674 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 25 09:32:13 crc kubenswrapper[4565]: E1125 09:32:13.443695 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7fa1c0-445d-4119-a37c-fd5192d8466b" containerName="extract-utilities" Nov 25 09:32:13 crc kubenswrapper[4565]: I1125 09:32:13.443716 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7fa1c0-445d-4119-a37c-fd5192d8466b" 
containerName="extract-utilities" Nov 25 09:32:13 crc kubenswrapper[4565]: E1125 09:32:13.443757 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7fa1c0-445d-4119-a37c-fd5192d8466b" containerName="extract-content" Nov 25 09:32:13 crc kubenswrapper[4565]: I1125 09:32:13.443782 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7fa1c0-445d-4119-a37c-fd5192d8466b" containerName="extract-content" Nov 25 09:32:13 crc kubenswrapper[4565]: E1125 09:32:13.443792 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7fa1c0-445d-4119-a37c-fd5192d8466b" containerName="registry-server" Nov 25 09:32:13 crc kubenswrapper[4565]: I1125 09:32:13.443800 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7fa1c0-445d-4119-a37c-fd5192d8466b" containerName="registry-server" Nov 25 09:32:13 crc kubenswrapper[4565]: I1125 09:32:13.444206 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f7fa1c0-445d-4119-a37c-fd5192d8466b" containerName="registry-server" Nov 25 09:32:13 crc kubenswrapper[4565]: I1125 09:32:13.445146 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 09:32:13 crc kubenswrapper[4565]: I1125 09:32:13.447013 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 25 09:32:13 crc kubenswrapper[4565]: I1125 09:32:13.448048 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 25 09:32:13 crc kubenswrapper[4565]: I1125 09:32:13.448574 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 25 09:32:13 crc kubenswrapper[4565]: I1125 09:32:13.530006 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/550a1232-b34b-4ad4-9b50-8abbd36d9e98-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"550a1232-b34b-4ad4-9b50-8abbd36d9e98\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 09:32:13 crc kubenswrapper[4565]: I1125 09:32:13.530112 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/550a1232-b34b-4ad4-9b50-8abbd36d9e98-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"550a1232-b34b-4ad4-9b50-8abbd36d9e98\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 09:32:13 crc kubenswrapper[4565]: I1125 09:32:13.632547 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/550a1232-b34b-4ad4-9b50-8abbd36d9e98-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"550a1232-b34b-4ad4-9b50-8abbd36d9e98\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 09:32:13 crc kubenswrapper[4565]: I1125 09:32:13.632630 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/550a1232-b34b-4ad4-9b50-8abbd36d9e98-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"550a1232-b34b-4ad4-9b50-8abbd36d9e98\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 09:32:13 crc kubenswrapper[4565]: I1125 09:32:13.632724 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/550a1232-b34b-4ad4-9b50-8abbd36d9e98-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"550a1232-b34b-4ad4-9b50-8abbd36d9e98\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 09:32:13 crc kubenswrapper[4565]: I1125 09:32:13.656469 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/550a1232-b34b-4ad4-9b50-8abbd36d9e98-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"550a1232-b34b-4ad4-9b50-8abbd36d9e98\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 09:32:13 crc kubenswrapper[4565]: I1125 09:32:13.773009 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 09:32:14 crc kubenswrapper[4565]: I1125 09:32:14.213291 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 25 09:32:15 crc kubenswrapper[4565]: E1125 09:32:15.055035 4565 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7fa1c0_445d_4119_a37c_fd5192d8466b.slice/crio-8586ddc1d438bbd40b586c9f98b8d0579924b1578a3cf3b3a3b1685867ed940a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7fa1c0_445d_4119_a37c_fd5192d8466b.slice\": RecentStats: unable to find data in memory cache]" Nov 25 09:32:15 crc kubenswrapper[4565]: I1125 09:32:15.055296 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"550a1232-b34b-4ad4-9b50-8abbd36d9e98","Type":"ContainerStarted","Data":"3a9b7fb58f2c584af7aa28f1e7c36a324660fc215a6f4d691eb4bd74122d8136"} Nov 25 09:32:15 crc kubenswrapper[4565]: I1125 09:32:15.057377 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"550a1232-b34b-4ad4-9b50-8abbd36d9e98","Type":"ContainerStarted","Data":"2780188d78357c1c6030d07ada7a039b893dd4ac5d8710b4bdb3d80ee08ad7ca"} Nov 25 09:32:15 crc kubenswrapper[4565]: I1125 09:32:15.085596 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.085567846 podStartE2EDuration="2.085567846s" podCreationTimestamp="2025-11-25 09:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:32:15.074954969 +0000 UTC m=+1668.277450107" watchObservedRunningTime="2025-11-25 
09:32:15.085567846 +0000 UTC m=+1668.288062974" Nov 25 09:32:16 crc kubenswrapper[4565]: I1125 09:32:16.066390 4565 generic.go:334] "Generic (PLEG): container finished" podID="550a1232-b34b-4ad4-9b50-8abbd36d9e98" containerID="3a9b7fb58f2c584af7aa28f1e7c36a324660fc215a6f4d691eb4bd74122d8136" exitCode=0 Nov 25 09:32:16 crc kubenswrapper[4565]: I1125 09:32:16.066675 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"550a1232-b34b-4ad4-9b50-8abbd36d9e98","Type":"ContainerDied","Data":"3a9b7fb58f2c584af7aa28f1e7c36a324660fc215a6f4d691eb4bd74122d8136"} Nov 25 09:32:17 crc kubenswrapper[4565]: I1125 09:32:17.351012 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 09:32:17 crc kubenswrapper[4565]: I1125 09:32:17.411622 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/550a1232-b34b-4ad4-9b50-8abbd36d9e98-kubelet-dir\") pod \"550a1232-b34b-4ad4-9b50-8abbd36d9e98\" (UID: \"550a1232-b34b-4ad4-9b50-8abbd36d9e98\") " Nov 25 09:32:17 crc kubenswrapper[4565]: I1125 09:32:17.411775 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/550a1232-b34b-4ad4-9b50-8abbd36d9e98-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "550a1232-b34b-4ad4-9b50-8abbd36d9e98" (UID: "550a1232-b34b-4ad4-9b50-8abbd36d9e98"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:32:17 crc kubenswrapper[4565]: I1125 09:32:17.411960 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/550a1232-b34b-4ad4-9b50-8abbd36d9e98-kube-api-access\") pod \"550a1232-b34b-4ad4-9b50-8abbd36d9e98\" (UID: \"550a1232-b34b-4ad4-9b50-8abbd36d9e98\") " Nov 25 09:32:17 crc kubenswrapper[4565]: I1125 09:32:17.413010 4565 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/550a1232-b34b-4ad4-9b50-8abbd36d9e98-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 25 09:32:17 crc kubenswrapper[4565]: I1125 09:32:17.417073 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/550a1232-b34b-4ad4-9b50-8abbd36d9e98-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "550a1232-b34b-4ad4-9b50-8abbd36d9e98" (UID: "550a1232-b34b-4ad4-9b50-8abbd36d9e98"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:32:17 crc kubenswrapper[4565]: I1125 09:32:17.515919 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/550a1232-b34b-4ad4-9b50-8abbd36d9e98-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 09:32:18 crc kubenswrapper[4565]: I1125 09:32:18.089215 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"550a1232-b34b-4ad4-9b50-8abbd36d9e98","Type":"ContainerDied","Data":"2780188d78357c1c6030d07ada7a039b893dd4ac5d8710b4bdb3d80ee08ad7ca"} Nov 25 09:32:18 crc kubenswrapper[4565]: I1125 09:32:18.089270 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2780188d78357c1c6030d07ada7a039b893dd4ac5d8710b4bdb3d80ee08ad7ca" Nov 25 09:32:18 crc kubenswrapper[4565]: I1125 09:32:18.089292 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 09:32:20 crc kubenswrapper[4565]: I1125 09:32:20.634564 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 25 09:32:20 crc kubenswrapper[4565]: E1125 09:32:20.635888 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="550a1232-b34b-4ad4-9b50-8abbd36d9e98" containerName="pruner" Nov 25 09:32:20 crc kubenswrapper[4565]: I1125 09:32:20.635922 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="550a1232-b34b-4ad4-9b50-8abbd36d9e98" containerName="pruner" Nov 25 09:32:20 crc kubenswrapper[4565]: I1125 09:32:20.636191 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="550a1232-b34b-4ad4-9b50-8abbd36d9e98" containerName="pruner" Nov 25 09:32:20 crc kubenswrapper[4565]: I1125 09:32:20.636861 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 25 09:32:20 crc kubenswrapper[4565]: I1125 09:32:20.639553 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 25 09:32:20 crc kubenswrapper[4565]: I1125 09:32:20.639642 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 25 09:32:20 crc kubenswrapper[4565]: I1125 09:32:20.644665 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 25 09:32:20 crc kubenswrapper[4565]: I1125 09:32:20.675981 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae3fe2ce-4759-4160-b326-4f1e15679e96-kube-api-access\") pod \"installer-9-crc\" (UID: \"ae3fe2ce-4759-4160-b326-4f1e15679e96\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 25 09:32:20 crc kubenswrapper[4565]: I1125 09:32:20.676039 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae3fe2ce-4759-4160-b326-4f1e15679e96-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ae3fe2ce-4759-4160-b326-4f1e15679e96\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 25 09:32:20 crc kubenswrapper[4565]: I1125 09:32:20.676097 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ae3fe2ce-4759-4160-b326-4f1e15679e96-var-lock\") pod \"installer-9-crc\" (UID: \"ae3fe2ce-4759-4160-b326-4f1e15679e96\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 25 09:32:20 crc kubenswrapper[4565]: I1125 09:32:20.778214 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/ae3fe2ce-4759-4160-b326-4f1e15679e96-kube-api-access\") pod \"installer-9-crc\" (UID: \"ae3fe2ce-4759-4160-b326-4f1e15679e96\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 25 09:32:20 crc kubenswrapper[4565]: I1125 09:32:20.778324 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae3fe2ce-4759-4160-b326-4f1e15679e96-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ae3fe2ce-4759-4160-b326-4f1e15679e96\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 25 09:32:20 crc kubenswrapper[4565]: I1125 09:32:20.778560 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae3fe2ce-4759-4160-b326-4f1e15679e96-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ae3fe2ce-4759-4160-b326-4f1e15679e96\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 25 09:32:20 crc kubenswrapper[4565]: I1125 09:32:20.778620 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ae3fe2ce-4759-4160-b326-4f1e15679e96-var-lock\") pod \"installer-9-crc\" (UID: \"ae3fe2ce-4759-4160-b326-4f1e15679e96\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 25 09:32:20 crc kubenswrapper[4565]: I1125 09:32:20.778701 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ae3fe2ce-4759-4160-b326-4f1e15679e96-var-lock\") pod \"installer-9-crc\" (UID: \"ae3fe2ce-4759-4160-b326-4f1e15679e96\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 25 09:32:20 crc kubenswrapper[4565]: I1125 09:32:20.800642 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae3fe2ce-4759-4160-b326-4f1e15679e96-kube-api-access\") pod \"installer-9-crc\" (UID: \"ae3fe2ce-4759-4160-b326-4f1e15679e96\") " 
pod="openshift-kube-apiserver/installer-9-crc" Nov 25 09:32:20 crc kubenswrapper[4565]: I1125 09:32:20.956044 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 25 09:32:21 crc kubenswrapper[4565]: I1125 09:32:21.360887 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 25 09:32:22 crc kubenswrapper[4565]: I1125 09:32:22.136084 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ae3fe2ce-4759-4160-b326-4f1e15679e96","Type":"ContainerStarted","Data":"ef01799540f1ec13ed775935859babbc343f1551ac21e87e869d2c23d8c965d3"} Nov 25 09:32:22 crc kubenswrapper[4565]: I1125 09:32:22.136583 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ae3fe2ce-4759-4160-b326-4f1e15679e96","Type":"ContainerStarted","Data":"87dbb4c1846601277e7c0bd3ec5283f58bd4816f8da5a0a2dcaee7c62d0d110a"} Nov 25 09:32:22 crc kubenswrapper[4565]: I1125 09:32:22.161745 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.16171612 podStartE2EDuration="2.16171612s" podCreationTimestamp="2025-11-25 09:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:32:22.154393816 +0000 UTC m=+1675.356888954" watchObservedRunningTime="2025-11-25 09:32:22.16171612 +0000 UTC m=+1675.364211258" Nov 25 09:32:25 crc kubenswrapper[4565]: E1125 09:32:25.311345 4565 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7fa1c0_445d_4119_a37c_fd5192d8466b.slice/crio-8586ddc1d438bbd40b586c9f98b8d0579924b1578a3cf3b3a3b1685867ed940a\": RecentStats: unable to find data in 
memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7fa1c0_445d_4119_a37c_fd5192d8466b.slice\": RecentStats: unable to find data in memory cache]" Nov 25 09:32:26 crc kubenswrapper[4565]: I1125 09:32:26.096821 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d" Nov 25 09:32:26 crc kubenswrapper[4565]: E1125 09:32:26.097633 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:32:35 crc kubenswrapper[4565]: I1125 09:32:35.801416 4565 scope.go:117] "RemoveContainer" containerID="b4cc5ccc2dcf5d502d78a9e42f5309ed5a97af0358538892d23e6540da4b7e1b" Nov 25 09:32:40 crc kubenswrapper[4565]: I1125 09:32:40.096755 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d" Nov 25 09:32:40 crc kubenswrapper[4565]: E1125 09:32:40.097542 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:32:55 crc kubenswrapper[4565]: I1125 09:32:55.097863 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d" Nov 25 09:32:55 crc kubenswrapper[4565]: E1125 09:32:55.099450 4565 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.407224 4565 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.409724 4565 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.409765 4565 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.410470 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c" gracePeriod=15 Nov 25 09:32:59 crc kubenswrapper[4565]: E1125 09:32:59.410534 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.410578 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 25 09:32:59 crc kubenswrapper[4565]: E1125 09:32:59.410609 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.410619 4565 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.410620 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419" gracePeriod=15 Nov 25 09:32:59 crc kubenswrapper[4565]: E1125 09:32:59.410649 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.410658 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.410606 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad" gracePeriod=15 Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.410675 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e" gracePeriod=15 Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.410727 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" 
containerID="cri-o://3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f" gracePeriod=15 Nov 25 09:32:59 crc kubenswrapper[4565]: E1125 09:32:59.410669 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.410915 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 25 09:32:59 crc kubenswrapper[4565]: E1125 09:32:59.410956 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.410965 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 25 09:32:59 crc kubenswrapper[4565]: E1125 09:32:59.410985 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.410992 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.411149 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.411286 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.411305 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.411326 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.411347 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.411356 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 25 09:32:59 crc kubenswrapper[4565]: E1125 09:32:59.411616 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.411629 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.411849 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.422318 4565 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" 
oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.471229 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.472913 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.472971 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.473001 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.473021 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.473047 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.473089 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.473109 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.473129 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.584354 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.584418 4565 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.584447 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.584482 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.584477 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.584552 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.584587 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.584606 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.584628 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.584677 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.584705 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.584751 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.584969 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.585144 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.585171 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.585215 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: I1125 09:32:59.762012 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 09:32:59 crc kubenswrapper[4565]: E1125 09:32:59.800911 4565 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.26.129:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187b361d994e2e85 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-25 09:32:59.799670405 +0000 UTC m=+1713.002165544,LastTimestamp:2025-11-25 09:32:59.799670405 +0000 UTC m=+1713.002165544,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 25 09:33:00 crc kubenswrapper[4565]: I1125 09:33:00.509821 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"db447d0f0f9f9858e01a52d6842986ff54040a5bc14d30ef9a5749f3f51caf39"} Nov 25 09:33:00 crc kubenswrapper[4565]: I1125 09:33:00.510252 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"319441a64fb18be1575a2b35a79cd472b7f5f3c7df02c7c35f4755fa4ea15107"} Nov 25 09:33:00 crc 
kubenswrapper[4565]: I1125 09:33:00.510726 4565 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:00 crc kubenswrapper[4565]: I1125 09:33:00.513430 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 25 09:33:00 crc kubenswrapper[4565]: I1125 09:33:00.515088 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 25 09:33:00 crc kubenswrapper[4565]: I1125 09:33:00.516700 4565 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad" exitCode=0 Nov 25 09:33:00 crc kubenswrapper[4565]: I1125 09:33:00.516739 4565 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419" exitCode=0 Nov 25 09:33:00 crc kubenswrapper[4565]: I1125 09:33:00.516753 4565 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e" exitCode=0 Nov 25 09:33:00 crc kubenswrapper[4565]: I1125 09:33:00.516763 4565 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f" exitCode=2 Nov 25 09:33:00 crc kubenswrapper[4565]: I1125 09:33:00.516832 4565 scope.go:117] "RemoveContainer" 
containerID="de56491da107903bacfd641845e1a3a6c5525d863bb76dbd733ee13d3a5ca1a7" Nov 25 09:33:00 crc kubenswrapper[4565]: I1125 09:33:00.519216 4565 generic.go:334] "Generic (PLEG): container finished" podID="ae3fe2ce-4759-4160-b326-4f1e15679e96" containerID="ef01799540f1ec13ed775935859babbc343f1551ac21e87e869d2c23d8c965d3" exitCode=0 Nov 25 09:33:00 crc kubenswrapper[4565]: I1125 09:33:00.519250 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ae3fe2ce-4759-4160-b326-4f1e15679e96","Type":"ContainerDied","Data":"ef01799540f1ec13ed775935859babbc343f1551ac21e87e869d2c23d8c965d3"} Nov 25 09:33:00 crc kubenswrapper[4565]: I1125 09:33:00.520115 4565 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:00 crc kubenswrapper[4565]: I1125 09:33:00.520646 4565 status_manager.go:851] "Failed to get status for pod" podUID="ae3fe2ce-4759-4160-b326-4f1e15679e96" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:01 crc kubenswrapper[4565]: I1125 09:33:01.373096 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" podUID="5275621d-5c51-4586-85f2-e0e24cb32266" containerName="kube-state-metrics" probeResult="failure" output="HTTP probe failed with statuscode: 503" Nov 25 09:33:01 crc kubenswrapper[4565]: I1125 09:33:01.536135 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" 
Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.021453 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.022539 4565 status_manager.go:851] "Failed to get status for pod" podUID="ae3fe2ce-4759-4160-b326-4f1e15679e96" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.022922 4565 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.029514 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.030442 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.031130 4565 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.031471 4565 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.031809 4565 status_manager.go:851] "Failed to get status for pod" podUID="ae3fe2ce-4759-4160-b326-4f1e15679e96" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.137316 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.137400 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ae3fe2ce-4759-4160-b326-4f1e15679e96-var-lock\") pod \"ae3fe2ce-4759-4160-b326-4f1e15679e96\" (UID: \"ae3fe2ce-4759-4160-b326-4f1e15679e96\") " Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 
09:33:02.137462 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.137485 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.137473 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.137513 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae3fe2ce-4759-4160-b326-4f1e15679e96-kubelet-dir\") pod \"ae3fe2ce-4759-4160-b326-4f1e15679e96\" (UID: \"ae3fe2ce-4759-4160-b326-4f1e15679e96\") " Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.137526 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae3fe2ce-4759-4160-b326-4f1e15679e96-var-lock" (OuterVolumeSpecName: "var-lock") pod "ae3fe2ce-4759-4160-b326-4f1e15679e96" (UID: "ae3fe2ce-4759-4160-b326-4f1e15679e96"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.137551 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae3fe2ce-4759-4160-b326-4f1e15679e96-kube-api-access\") pod \"ae3fe2ce-4759-4160-b326-4f1e15679e96\" (UID: \"ae3fe2ce-4759-4160-b326-4f1e15679e96\") " Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.137551 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.137577 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae3fe2ce-4759-4160-b326-4f1e15679e96-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ae3fe2ce-4759-4160-b326-4f1e15679e96" (UID: "ae3fe2ce-4759-4160-b326-4f1e15679e96"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.137576 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.137912 4565 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.138001 4565 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ae3fe2ce-4759-4160-b326-4f1e15679e96-var-lock\") on node \"crc\" DevicePath \"\"" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.138013 4565 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.138023 4565 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.138031 4565 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae3fe2ce-4759-4160-b326-4f1e15679e96-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.143475 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae3fe2ce-4759-4160-b326-4f1e15679e96-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ae3fe2ce-4759-4160-b326-4f1e15679e96" (UID: "ae3fe2ce-4759-4160-b326-4f1e15679e96"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.240355 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae3fe2ce-4759-4160-b326-4f1e15679e96-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.550367 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.551411 4565 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c" exitCode=0 Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.551533 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.551550 4565 scope.go:117] "RemoveContainer" containerID="304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.554572 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ae3fe2ce-4759-4160-b326-4f1e15679e96","Type":"ContainerDied","Data":"87dbb4c1846601277e7c0bd3ec5283f58bd4816f8da5a0a2dcaee7c62d0d110a"} Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.554615 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87dbb4c1846601277e7c0bd3ec5283f58bd4816f8da5a0a2dcaee7c62d0d110a" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.554634 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.565821 4565 status_manager.go:851] "Failed to get status for pod" podUID="ae3fe2ce-4759-4160-b326-4f1e15679e96" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.566271 4565 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.567155 4565 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.582858 4565 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.583974 4565 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.584291 4565 status_manager.go:851] "Failed to get status for pod" podUID="ae3fe2ce-4759-4160-b326-4f1e15679e96" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.588172 4565 scope.go:117] "RemoveContainer" containerID="f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.614216 4565 scope.go:117] "RemoveContainer" containerID="17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.636312 4565 scope.go:117] "RemoveContainer" containerID="3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.658676 4565 scope.go:117] "RemoveContainer" containerID="004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.693417 4565 scope.go:117] "RemoveContainer" containerID="32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.734420 4565 scope.go:117] "RemoveContainer" containerID="304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad" Nov 25 09:33:02 crc kubenswrapper[4565]: E1125 09:33:02.734853 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\": container with ID starting with 304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad not 
found: ID does not exist" containerID="304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.734880 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad"} err="failed to get container status \"304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\": rpc error: code = NotFound desc = could not find container \"304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad\": container with ID starting with 304d2aefa5ab2ee74cbd4d0e79774ba5eef732e1d7c4ec5f22ce86a2a8f9aaad not found: ID does not exist" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.734905 4565 scope.go:117] "RemoveContainer" containerID="f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419" Nov 25 09:33:02 crc kubenswrapper[4565]: E1125 09:33:02.736151 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\": container with ID starting with f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419 not found: ID does not exist" containerID="f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.736194 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419"} err="failed to get container status \"f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\": rpc error: code = NotFound desc = could not find container \"f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419\": container with ID starting with f8fcbedc6bbc9c0b1c44b9c3ee0d747b4f56028425cdae5d4f12ff5992cd3419 not found: ID does not exist" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.736224 
4565 scope.go:117] "RemoveContainer" containerID="17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e" Nov 25 09:33:02 crc kubenswrapper[4565]: E1125 09:33:02.737116 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\": container with ID starting with 17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e not found: ID does not exist" containerID="17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.737143 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e"} err="failed to get container status \"17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\": rpc error: code = NotFound desc = could not find container \"17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e\": container with ID starting with 17cdf1997acb00b87715600c58e9f962891c43b54f927ad27cd513cfdeafb96e not found: ID does not exist" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.737203 4565 scope.go:117] "RemoveContainer" containerID="3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f" Nov 25 09:33:02 crc kubenswrapper[4565]: E1125 09:33:02.737592 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\": container with ID starting with 3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f not found: ID does not exist" containerID="3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.737636 4565 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f"} err="failed to get container status \"3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\": rpc error: code = NotFound desc = could not find container \"3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f\": container with ID starting with 3c2772ff6af04e4f33c8f33599d64a03dba7cc98918c43aee6ea8bb252c4272f not found: ID does not exist" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.737669 4565 scope.go:117] "RemoveContainer" containerID="004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c" Nov 25 09:33:02 crc kubenswrapper[4565]: E1125 09:33:02.738050 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\": container with ID starting with 004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c not found: ID does not exist" containerID="004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.738079 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c"} err="failed to get container status \"004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\": rpc error: code = NotFound desc = could not find container \"004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c\": container with ID starting with 004222a349327e4075f8fd83d6a77df2f9e99c5902645e6db91b3711bed7701c not found: ID does not exist" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.738100 4565 scope.go:117] "RemoveContainer" containerID="32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc" Nov 25 09:33:02 crc kubenswrapper[4565]: E1125 09:33:02.738357 4565 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\": container with ID starting with 32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc not found: ID does not exist" containerID="32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc" Nov 25 09:33:02 crc kubenswrapper[4565]: I1125 09:33:02.738381 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc"} err="failed to get container status \"32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\": rpc error: code = NotFound desc = could not find container \"32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc\": container with ID starting with 32043ed77815bdd26c415b41a6ef55f4e7ecc9a7b51f5d619e6d9d13affb3dfc not found: ID does not exist" Nov 25 09:33:03 crc kubenswrapper[4565]: E1125 09:33:03.043514 4565 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:03 crc kubenswrapper[4565]: E1125 09:33:03.044011 4565 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:03 crc kubenswrapper[4565]: E1125 09:33:03.044846 4565 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:03 crc kubenswrapper[4565]: E1125 09:33:03.045124 4565 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:03 crc kubenswrapper[4565]: E1125 09:33:03.047274 4565 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:03 crc kubenswrapper[4565]: I1125 09:33:03.048082 4565 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Nov 25 09:33:03 crc kubenswrapper[4565]: E1125 09:33:03.049377 4565 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.129:6443: connect: connection refused" interval="200ms" Nov 25 09:33:03 crc kubenswrapper[4565]: I1125 09:33:03.116643 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Nov 25 09:33:03 crc kubenswrapper[4565]: E1125 09:33:03.252000 4565 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.129:6443: connect: connection refused" interval="400ms" Nov 25 09:33:03 crc kubenswrapper[4565]: E1125 09:33:03.652788 4565 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.129:6443: connect: connection refused" interval="800ms" Nov 25 09:33:04 crc kubenswrapper[4565]: E1125 09:33:04.454217 4565 controller.go:145] 
"Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.129:6443: connect: connection refused" interval="1.6s" Nov 25 09:33:06 crc kubenswrapper[4565]: E1125 09:33:06.054573 4565 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.129:6443: connect: connection refused" interval="3.2s" Nov 25 09:33:07 crc kubenswrapper[4565]: I1125 09:33:07.103652 4565 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:07 crc kubenswrapper[4565]: I1125 09:33:07.104264 4565 status_manager.go:851] "Failed to get status for pod" podUID="ae3fe2ce-4759-4160-b326-4f1e15679e96" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:07 crc kubenswrapper[4565]: E1125 09:33:07.741969 4565 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.26.129:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187b361d994e2e85 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-25 09:32:59.799670405 +0000 UTC m=+1713.002165544,LastTimestamp:2025-11-25 09:32:59.799670405 +0000 UTC m=+1713.002165544,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 25 09:33:09 crc kubenswrapper[4565]: E1125 09:33:09.256158 4565 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.129:6443: connect: connection refused" interval="6.4s" Nov 25 09:33:10 crc kubenswrapper[4565]: I1125 09:33:10.098332 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d" Nov 25 09:33:10 crc kubenswrapper[4565]: E1125 09:33:10.099342 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:33:10 crc kubenswrapper[4565]: I1125 09:33:10.631694 4565 generic.go:334] "Generic (PLEG): container finished" podID="145e5d59-fd78-4bc1-a97c-17ebf0d67fa4" 
containerID="7030d146b2f7fe1b4e54b3787a2f3e7159659f8c7e0f85a057e86ee80e9a6ff9" exitCode=1 Nov 25 09:33:10 crc kubenswrapper[4565]: I1125 09:33:10.631765 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" event={"ID":"145e5d59-fd78-4bc1-a97c-17ebf0d67fa4","Type":"ContainerDied","Data":"7030d146b2f7fe1b4e54b3787a2f3e7159659f8c7e0f85a057e86ee80e9a6ff9"} Nov 25 09:33:10 crc kubenswrapper[4565]: I1125 09:33:10.632631 4565 scope.go:117] "RemoveContainer" containerID="7030d146b2f7fe1b4e54b3787a2f3e7159659f8c7e0f85a057e86ee80e9a6ff9" Nov 25 09:33:10 crc kubenswrapper[4565]: I1125 09:33:10.632636 4565 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:10 crc kubenswrapper[4565]: I1125 09:33:10.633110 4565 status_manager.go:851] "Failed to get status for pod" podUID="ae3fe2ce-4759-4160-b326-4f1e15679e96" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:10 crc kubenswrapper[4565]: I1125 09:33:10.633480 4565 status_manager.go:851] "Failed to get status for pod" podUID="145e5d59-fd78-4bc1-a97c-17ebf0d67fa4" pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-74454849f9-fjwfp\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:11 crc kubenswrapper[4565]: I1125 09:33:11.372359 4565 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openstack/kube-state-metrics-0" podUID="5275621d-5c51-4586-85f2-e0e24cb32266" containerName="kube-state-metrics" probeResult="failure" output="HTTP probe failed with statuscode: 503" Nov 25 09:33:11 crc kubenswrapper[4565]: I1125 09:33:11.643866 4565 generic.go:334] "Generic (PLEG): container finished" podID="145e5d59-fd78-4bc1-a97c-17ebf0d67fa4" containerID="d2c6381e3a2fc6186927332b31949292bc6820cbd9e22f6bcc8c79991c41cc26" exitCode=1 Nov 25 09:33:11 crc kubenswrapper[4565]: I1125 09:33:11.643948 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" event={"ID":"145e5d59-fd78-4bc1-a97c-17ebf0d67fa4","Type":"ContainerDied","Data":"d2c6381e3a2fc6186927332b31949292bc6820cbd9e22f6bcc8c79991c41cc26"} Nov 25 09:33:11 crc kubenswrapper[4565]: I1125 09:33:11.644004 4565 scope.go:117] "RemoveContainer" containerID="7030d146b2f7fe1b4e54b3787a2f3e7159659f8c7e0f85a057e86ee80e9a6ff9" Nov 25 09:33:11 crc kubenswrapper[4565]: I1125 09:33:11.645109 4565 scope.go:117] "RemoveContainer" containerID="d2c6381e3a2fc6186927332b31949292bc6820cbd9e22f6bcc8c79991c41cc26" Nov 25 09:33:11 crc kubenswrapper[4565]: I1125 09:33:11.645324 4565 status_manager.go:851] "Failed to get status for pod" podUID="145e5d59-fd78-4bc1-a97c-17ebf0d67fa4" pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-74454849f9-fjwfp\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:11 crc kubenswrapper[4565]: E1125 09:33:11.645472 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=metallb-operator-controller-manager-74454849f9-fjwfp_metallb-system(145e5d59-fd78-4bc1-a97c-17ebf0d67fa4)\"" 
pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" podUID="145e5d59-fd78-4bc1-a97c-17ebf0d67fa4" Nov 25 09:33:11 crc kubenswrapper[4565]: I1125 09:33:11.645597 4565 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:11 crc kubenswrapper[4565]: I1125 09:33:11.646317 4565 status_manager.go:851] "Failed to get status for pod" podUID="ae3fe2ce-4759-4160-b326-4f1e15679e96" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:13 crc kubenswrapper[4565]: I1125 09:33:13.097253 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:33:13 crc kubenswrapper[4565]: I1125 09:33:13.098708 4565 status_manager.go:851] "Failed to get status for pod" podUID="ae3fe2ce-4759-4160-b326-4f1e15679e96" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:13 crc kubenswrapper[4565]: I1125 09:33:13.099110 4565 status_manager.go:851] "Failed to get status for pod" podUID="145e5d59-fd78-4bc1-a97c-17ebf0d67fa4" pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-74454849f9-fjwfp\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:13 crc kubenswrapper[4565]: I1125 09:33:13.099546 4565 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:13 crc kubenswrapper[4565]: I1125 09:33:13.114556 4565 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93d20159-72d2-4207-9884-03b4ea42de14" Nov 25 09:33:13 crc kubenswrapper[4565]: I1125 09:33:13.114590 4565 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93d20159-72d2-4207-9884-03b4ea42de14" Nov 25 09:33:13 crc kubenswrapper[4565]: E1125 09:33:13.115091 4565 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.129:6443: 
connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:33:13 crc kubenswrapper[4565]: I1125 09:33:13.115782 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:33:13 crc kubenswrapper[4565]: W1125 09:33:13.155339 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-8f805e6c1f0663c9757be56f54747e45d25cc9d79e280bd6ac2e2edc15e0e18b WatchSource:0}: Error finding container 8f805e6c1f0663c9757be56f54747e45d25cc9d79e280bd6ac2e2edc15e0e18b: Status 404 returned error can't find the container with id 8f805e6c1f0663c9757be56f54747e45d25cc9d79e280bd6ac2e2edc15e0e18b Nov 25 09:33:13 crc kubenswrapper[4565]: I1125 09:33:13.662309 4565 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="3589210257e3956748cd78dcb7e271edcf3e7e83cd4977289fe7f50a7f23ed46" exitCode=0 Nov 25 09:33:13 crc kubenswrapper[4565]: I1125 09:33:13.662355 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"3589210257e3956748cd78dcb7e271edcf3e7e83cd4977289fe7f50a7f23ed46"} Nov 25 09:33:13 crc kubenswrapper[4565]: I1125 09:33:13.662401 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8f805e6c1f0663c9757be56f54747e45d25cc9d79e280bd6ac2e2edc15e0e18b"} Nov 25 09:33:13 crc kubenswrapper[4565]: I1125 09:33:13.662676 4565 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93d20159-72d2-4207-9884-03b4ea42de14" Nov 25 09:33:13 crc kubenswrapper[4565]: I1125 09:33:13.662693 4565 mirror_client.go:130] "Deleting a mirror 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93d20159-72d2-4207-9884-03b4ea42de14" Nov 25 09:33:13 crc kubenswrapper[4565]: E1125 09:33:13.663107 4565 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:33:13 crc kubenswrapper[4565]: I1125 09:33:13.663196 4565 status_manager.go:851] "Failed to get status for pod" podUID="145e5d59-fd78-4bc1-a97c-17ebf0d67fa4" pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-74454849f9-fjwfp\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:13 crc kubenswrapper[4565]: I1125 09:33:13.663575 4565 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:13 crc kubenswrapper[4565]: I1125 09:33:13.663880 4565 status_manager.go:851] "Failed to get status for pod" podUID="ae3fe2ce-4759-4160-b326-4f1e15679e96" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.129:6443: connect: connection refused" Nov 25 09:33:14 crc kubenswrapper[4565]: I1125 09:33:14.676589 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 25 09:33:14 crc kubenswrapper[4565]: I1125 
09:33:14.677195 4565 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb" exitCode=1 Nov 25 09:33:14 crc kubenswrapper[4565]: I1125 09:33:14.677268 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb"} Nov 25 09:33:14 crc kubenswrapper[4565]: I1125 09:33:14.678310 4565 scope.go:117] "RemoveContainer" containerID="67b41f42fe377e5199fb9477b9f9fa788639541fce9b3f92247e1dc50a5512fb" Nov 25 09:33:14 crc kubenswrapper[4565]: I1125 09:33:14.686011 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3f685fcc8396d64b995c04d454e637e1a6293575c82eccc037100683ae9e2104"} Nov 25 09:33:14 crc kubenswrapper[4565]: I1125 09:33:14.686052 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f14d92b8aef4aefa5b474267a5f75335e47aded83eddf33da477a0e64b239496"} Nov 25 09:33:14 crc kubenswrapper[4565]: I1125 09:33:14.686065 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"969dff8b17b28292bca054a95398782c4f9b3888446a696044e84d5bb3273234"} Nov 25 09:33:14 crc kubenswrapper[4565]: I1125 09:33:14.686075 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6ced00ef24706ebfb708b5790a9fe6374c2e8ed55bc7802bb0e0d863e9632a37"} Nov 25 09:33:15 crc 
kubenswrapper[4565]: I1125 09:33:15.696383 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 25 09:33:15 crc kubenswrapper[4565]: I1125 09:33:15.696486 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5c16d348824e47987ba4c9fe2bbc7435fafbc6064a68f9df9f4baa1d24310fa5"} Nov 25 09:33:15 crc kubenswrapper[4565]: I1125 09:33:15.700902 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8acc61f07a77a3381f4e2ad3bf6ad1d595f77c01b0f2baf15d33eedbb9eccc45"} Nov 25 09:33:15 crc kubenswrapper[4565]: I1125 09:33:15.701227 4565 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93d20159-72d2-4207-9884-03b4ea42de14" Nov 25 09:33:15 crc kubenswrapper[4565]: I1125 09:33:15.701252 4565 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93d20159-72d2-4207-9884-03b4ea42de14" Nov 25 09:33:15 crc kubenswrapper[4565]: I1125 09:33:15.701413 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:33:16 crc kubenswrapper[4565]: I1125 09:33:16.099505 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 09:33:16 crc kubenswrapper[4565]: I1125 09:33:16.661701 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" Nov 25 09:33:16 crc kubenswrapper[4565]: I1125 09:33:16.662450 4565 scope.go:117] "RemoveContainer" 
containerID="d2c6381e3a2fc6186927332b31949292bc6820cbd9e22f6bcc8c79991c41cc26" Nov 25 09:33:16 crc kubenswrapper[4565]: E1125 09:33:16.662759 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=metallb-operator-controller-manager-74454849f9-fjwfp_metallb-system(145e5d59-fd78-4bc1-a97c-17ebf0d67fa4)\"" pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" podUID="145e5d59-fd78-4bc1-a97c-17ebf0d67fa4" Nov 25 09:33:17 crc kubenswrapper[4565]: I1125 09:33:17.596975 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 09:33:17 crc kubenswrapper[4565]: I1125 09:33:17.601226 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 09:33:18 crc kubenswrapper[4565]: I1125 09:33:18.116642 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:33:18 crc kubenswrapper[4565]: I1125 09:33:18.116872 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:33:18 crc kubenswrapper[4565]: I1125 09:33:18.121459 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:33:19 crc kubenswrapper[4565]: I1125 09:33:19.738902 4565 generic.go:334] "Generic (PLEG): container finished" podID="cbdce822-eeeb-448b-9f3b-46fdf9e9b43d" containerID="4ae787c992f422bad5650db2f5c33027fd219d2cd3dc140feb4b9b01f2905d86" exitCode=1 Nov 25 09:33:19 crc kubenswrapper[4565]: I1125 09:33:19.738999 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cb74df96-sj4j7" 
event={"ID":"cbdce822-eeeb-448b-9f3b-46fdf9e9b43d","Type":"ContainerDied","Data":"4ae787c992f422bad5650db2f5c33027fd219d2cd3dc140feb4b9b01f2905d86"} Nov 25 09:33:19 crc kubenswrapper[4565]: I1125 09:33:19.740496 4565 scope.go:117] "RemoveContainer" containerID="4ae787c992f422bad5650db2f5c33027fd219d2cd3dc140feb4b9b01f2905d86" Nov 25 09:33:19 crc kubenswrapper[4565]: I1125 09:33:19.745052 4565 generic.go:334] "Generic (PLEG): container finished" podID="6279e5b8-cc23-4b43-9554-754a61174bcd" containerID="4e9d4c6939b486ac03437f55781802f96efce947532f9a9dce882321a0f36c48" exitCode=1 Nov 25 09:33:19 crc kubenswrapper[4565]: I1125 09:33:19.745097 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" event={"ID":"6279e5b8-cc23-4b43-9554-754a61174bcd","Type":"ContainerDied","Data":"4e9d4c6939b486ac03437f55781802f96efce947532f9a9dce882321a0f36c48"} Nov 25 09:33:19 crc kubenswrapper[4565]: I1125 09:33:19.745691 4565 scope.go:117] "RemoveContainer" containerID="4e9d4c6939b486ac03437f55781802f96efce947532f9a9dce882321a0f36c48" Nov 25 09:33:20 crc kubenswrapper[4565]: I1125 09:33:20.711823 4565 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:33:20 crc kubenswrapper[4565]: I1125 09:33:20.778762 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cb74df96-sj4j7" event={"ID":"cbdce822-eeeb-448b-9f3b-46fdf9e9b43d","Type":"ContainerStarted","Data":"830e11c16b9ea9edf74f478d2b64f28df93287c91f3e8651295a09e17b922c31"} Nov 25 09:33:20 crc kubenswrapper[4565]: I1125 09:33:20.780522 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cb74df96-sj4j7" Nov 25 09:33:20 crc kubenswrapper[4565]: I1125 09:33:20.785436 4565 generic.go:334] "Generic (PLEG): container finished" 
podID="052c7786-4d54-4af0-8598-91ff09cdf966" containerID="166feafff94ed9998cc225abbaa23b51797abc515f2146e5879179a9ba8ad307" exitCode=1 Nov 25 09:33:20 crc kubenswrapper[4565]: I1125 09:33:20.785498 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" event={"ID":"052c7786-4d54-4af0-8598-91ff09cdf966","Type":"ContainerDied","Data":"166feafff94ed9998cc225abbaa23b51797abc515f2146e5879179a9ba8ad307"} Nov 25 09:33:20 crc kubenswrapper[4565]: I1125 09:33:20.786180 4565 scope.go:117] "RemoveContainer" containerID="166feafff94ed9998cc225abbaa23b51797abc515f2146e5879179a9ba8ad307" Nov 25 09:33:20 crc kubenswrapper[4565]: I1125 09:33:20.789856 4565 generic.go:334] "Generic (PLEG): container finished" podID="6279e5b8-cc23-4b43-9554-754a61174bcd" containerID="fece14f8dd7ae8608f19ace2a9ae19b0a2f7ddf82794bc19a15fe2f63af6931b" exitCode=1 Nov 25 09:33:20 crc kubenswrapper[4565]: I1125 09:33:20.789913 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" event={"ID":"6279e5b8-cc23-4b43-9554-754a61174bcd","Type":"ContainerDied","Data":"fece14f8dd7ae8608f19ace2a9ae19b0a2f7ddf82794bc19a15fe2f63af6931b"} Nov 25 09:33:20 crc kubenswrapper[4565]: I1125 09:33:20.789991 4565 scope.go:117] "RemoveContainer" containerID="4e9d4c6939b486ac03437f55781802f96efce947532f9a9dce882321a0f36c48" Nov 25 09:33:20 crc kubenswrapper[4565]: I1125 09:33:20.790289 4565 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93d20159-72d2-4207-9884-03b4ea42de14" Nov 25 09:33:20 crc kubenswrapper[4565]: I1125 09:33:20.790310 4565 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93d20159-72d2-4207-9884-03b4ea42de14" Nov 25 09:33:20 crc kubenswrapper[4565]: I1125 09:33:20.790874 4565 scope.go:117] "RemoveContainer" 
containerID="fece14f8dd7ae8608f19ace2a9ae19b0a2f7ddf82794bc19a15fe2f63af6931b" Nov 25 09:33:20 crc kubenswrapper[4565]: E1125 09:33:20.791584 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=octavia-operator-controller-manager-fd75fd47d-hrr6t_openstack-operators(6279e5b8-cc23-4b43-9554-754a61174bcd)\"" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" podUID="6279e5b8-cc23-4b43-9554-754a61174bcd" Nov 25 09:33:20 crc kubenswrapper[4565]: I1125 09:33:20.797472 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:33:20 crc kubenswrapper[4565]: I1125 09:33:20.799998 4565 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4b374521-d202-4b62-b3ae-ee994800c958" Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.374195 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" podUID="5275621d-5c51-4586-85f2-e0e24cb32266" containerName="kube-state-metrics" probeResult="failure" output="HTTP probe failed with statuscode: 503" Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.374500 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/kube-state-metrics-0" Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.375107 4565 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-state-metrics" containerStatusID={"Type":"cri-o","ID":"ce5253aa6f321a5ef63976ba91e917dee7527c7db0254dce53a3e831d86f0ba8"} pod="openstack/kube-state-metrics-0" containerMessage="Container kube-state-metrics failed liveness probe, will be restarted" Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.375148 4565 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="5275621d-5c51-4586-85f2-e0e24cb32266" containerName="kube-state-metrics" containerID="cri-o://ce5253aa6f321a5ef63976ba91e917dee7527c7db0254dce53a3e831d86f0ba8" gracePeriod=30 Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.782908 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" podUID="d4a03edc-1b0f-4f50-bab7-b2292c453f4d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.81:8081/healthz\": dial tcp 10.217.0.81:8081: connect: connection refused" Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.785369 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" podUID="d4a03edc-1b0f-4f50-bab7-b2292c453f4d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.81:8081/readyz\": dial tcp 10.217.0.81:8081: connect: connection refused" Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.800234 4565 generic.go:334] "Generic (PLEG): container finished" podID="5275621d-5c51-4586-85f2-e0e24cb32266" containerID="ce5253aa6f321a5ef63976ba91e917dee7527c7db0254dce53a3e831d86f0ba8" exitCode=2 Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.800313 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5275621d-5c51-4586-85f2-e0e24cb32266","Type":"ContainerDied","Data":"ce5253aa6f321a5ef63976ba91e917dee7527c7db0254dce53a3e831d86f0ba8"} Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.803087 4565 generic.go:334] "Generic (PLEG): container finished" podID="052c7786-4d54-4af0-8598-91ff09cdf966" containerID="93384b9e296e68ef1b6e49f054b8a77404bc2ef513cceba546e0c57ea53bde50" exitCode=1 Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.803152 4565 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" event={"ID":"052c7786-4d54-4af0-8598-91ff09cdf966","Type":"ContainerDied","Data":"93384b9e296e68ef1b6e49f054b8a77404bc2ef513cceba546e0c57ea53bde50"} Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.803472 4565 scope.go:117] "RemoveContainer" containerID="166feafff94ed9998cc225abbaa23b51797abc515f2146e5879179a9ba8ad307" Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.804326 4565 scope.go:117] "RemoveContainer" containerID="93384b9e296e68ef1b6e49f054b8a77404bc2ef513cceba546e0c57ea53bde50" Nov 25 09:33:21 crc kubenswrapper[4565]: E1125 09:33:21.804915 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=ovn-operator-controller-manager-66cf5c67ff-zz6wf_openstack-operators(052c7786-4d54-4af0-8598-91ff09cdf966)\"" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" podUID="052c7786-4d54-4af0-8598-91ff09cdf966" Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.807749 4565 generic.go:334] "Generic (PLEG): container finished" podID="d5be161b-0f0c-485e-b1c7-50a9fff4b053" containerID="a12528fcdeb1de8e7dfa385a1d981fd380ed4e4bf30f3a9acc1ccde7d2b796b8" exitCode=1 Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.807874 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" event={"ID":"d5be161b-0f0c-485e-b1c7-50a9fff4b053","Type":"ContainerDied","Data":"a12528fcdeb1de8e7dfa385a1d981fd380ed4e4bf30f3a9acc1ccde7d2b796b8"} Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.808381 4565 scope.go:117] "RemoveContainer" containerID="a12528fcdeb1de8e7dfa385a1d981fd380ed4e4bf30f3a9acc1ccde7d2b796b8" Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.838955 4565 generic.go:334] "Generic (PLEG): 
container finished" podID="f35f4446-328e-40d3-96d6-2bc814fb8a96" containerID="cafda87c4a914bf7e4e5b3b61c210729540c87f72d5956d1d530ff48cab41135" exitCode=1 Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.839031 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" event={"ID":"f35f4446-328e-40d3-96d6-2bc814fb8a96","Type":"ContainerDied","Data":"cafda87c4a914bf7e4e5b3b61c210729540c87f72d5956d1d530ff48cab41135"} Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.839766 4565 scope.go:117] "RemoveContainer" containerID="cafda87c4a914bf7e4e5b3b61c210729540c87f72d5956d1d530ff48cab41135" Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.849000 4565 generic.go:334] "Generic (PLEG): container finished" podID="0c32d371-4207-4e71-8031-a27b6562f9a2" containerID="00b03beda1dbb5ff9b2b8b4e22cfa4e7e5a4452c957cb601e4d11b54b326fd52" exitCode=1 Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.849034 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-s8c4s" event={"ID":"0c32d371-4207-4e71-8031-a27b6562f9a2","Type":"ContainerDied","Data":"00b03beda1dbb5ff9b2b8b4e22cfa4e7e5a4452c957cb601e4d11b54b326fd52"} Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.849530 4565 scope.go:117] "RemoveContainer" containerID="00b03beda1dbb5ff9b2b8b4e22cfa4e7e5a4452c957cb601e4d11b54b326fd52" Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.860074 4565 generic.go:334] "Generic (PLEG): container finished" podID="d4a03edc-1b0f-4f50-bab7-b2292c453f4d" containerID="230df4302bf427ac98820ee312d238563af7d8491eff5c4b798ff05a82aa6f3f" exitCode=1 Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.860130 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" 
event={"ID":"d4a03edc-1b0f-4f50-bab7-b2292c453f4d","Type":"ContainerDied","Data":"230df4302bf427ac98820ee312d238563af7d8491eff5c4b798ff05a82aa6f3f"} Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.860541 4565 scope.go:117] "RemoveContainer" containerID="230df4302bf427ac98820ee312d238563af7d8491eff5c4b798ff05a82aa6f3f" Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.863089 4565 generic.go:334] "Generic (PLEG): container finished" podID="3791b99a-d877-470f-8a8f-56f7b02be997" containerID="1e376211332e007ac9c6b0e3d3f6d9972a045be74d6395f12a7b6c09deac8e26" exitCode=1 Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.863285 4565 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93d20159-72d2-4207-9884-03b4ea42de14" Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.863302 4565 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93d20159-72d2-4207-9884-03b4ea42de14" Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.863428 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" event={"ID":"3791b99a-d877-470f-8a8f-56f7b02be997","Type":"ContainerDied","Data":"1e376211332e007ac9c6b0e3d3f6d9972a045be74d6395f12a7b6c09deac8e26"} Nov 25 09:33:21 crc kubenswrapper[4565]: I1125 09:33:21.863717 4565 scope.go:117] "RemoveContainer" containerID="1e376211332e007ac9c6b0e3d3f6d9972a045be74d6395f12a7b6c09deac8e26" Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.097035 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d" Nov 25 09:33:22 crc kubenswrapper[4565]: E1125 09:33:22.097425 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.878347 4565 generic.go:334] "Generic (PLEG): container finished" podID="5275621d-5c51-4586-85f2-e0e24cb32266" containerID="b5592a5a7016586fd12310dd71ccecdf223089a0e072c008af0bdc1e65ad3008" exitCode=1 Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.878455 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5275621d-5c51-4586-85f2-e0e24cb32266","Type":"ContainerDied","Data":"b5592a5a7016586fd12310dd71ccecdf223089a0e072c008af0bdc1e65ad3008"} Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.878884 4565 scope.go:117] "RemoveContainer" containerID="ce5253aa6f321a5ef63976ba91e917dee7527c7db0254dce53a3e831d86f0ba8" Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.879527 4565 scope.go:117] "RemoveContainer" containerID="b5592a5a7016586fd12310dd71ccecdf223089a0e072c008af0bdc1e65ad3008" Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.887948 4565 generic.go:334] "Generic (PLEG): container finished" podID="f35f4446-328e-40d3-96d6-2bc814fb8a96" containerID="be1e26f0c8276ef465c6ecc95e1f2a46dc830a4eb6cf271efa889dba945f1a1c" exitCode=1 Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.888044 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" event={"ID":"f35f4446-328e-40d3-96d6-2bc814fb8a96","Type":"ContainerDied","Data":"be1e26f0c8276ef465c6ecc95e1f2a46dc830a4eb6cf271efa889dba945f1a1c"} Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.889028 4565 scope.go:117] "RemoveContainer" containerID="be1e26f0c8276ef465c6ecc95e1f2a46dc830a4eb6cf271efa889dba945f1a1c" Nov 25 09:33:22 crc kubenswrapper[4565]: E1125 09:33:22.889428 4565 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=swift-operator-controller-manager-6fdc4fcf86-zl2jr_openstack-operators(f35f4446-328e-40d3-96d6-2bc814fb8a96)\"" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" podUID="f35f4446-328e-40d3-96d6-2bc814fb8a96" Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.893423 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-s8c4s" event={"ID":"0c32d371-4207-4e71-8031-a27b6562f9a2","Type":"ContainerStarted","Data":"c6450f05ffcc1722f117f1ef63ad0bc5ad57f82b6e88f287ab1627e6e83ce403"} Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.893788 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-s8c4s" Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.897259 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" event={"ID":"d4a03edc-1b0f-4f50-bab7-b2292c453f4d","Type":"ContainerStarted","Data":"901e17ba904bc8b53591646194c8689bbd912495d01f9337bdeeba8e04d659d3"} Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.898234 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.901541 4565 generic.go:334] "Generic (PLEG): container finished" podID="3791b99a-d877-470f-8a8f-56f7b02be997" containerID="3808caf582fae45d0177a3cc8c982ff3deba632f9ec8ec60e01eb6832b54630a" exitCode=1 Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.901665 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" event={"ID":"3791b99a-d877-470f-8a8f-56f7b02be997","Type":"ContainerDied","Data":"3808caf582fae45d0177a3cc8c982ff3deba632f9ec8ec60e01eb6832b54630a"} Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.902148 4565 scope.go:117] "RemoveContainer" containerID="3808caf582fae45d0177a3cc8c982ff3deba632f9ec8ec60e01eb6832b54630a" Nov 25 09:33:22 crc kubenswrapper[4565]: E1125 09:33:22.902400 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=watcher-operator-controller-manager-864885998-v2c96_openstack-operators(3791b99a-d877-470f-8a8f-56f7b02be997)\"" pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" podUID="3791b99a-d877-470f-8a8f-56f7b02be997" Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.905552 4565 generic.go:334] "Generic (PLEG): container finished" podID="d5be161b-0f0c-485e-b1c7-50a9fff4b053" containerID="9ad324ad7207631b4c3ace3dfa8012001ad2d404f7545a3ef392b5cb2c4f0dc9" exitCode=1 Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.905736 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" event={"ID":"d5be161b-0f0c-485e-b1c7-50a9fff4b053","Type":"ContainerDied","Data":"9ad324ad7207631b4c3ace3dfa8012001ad2d404f7545a3ef392b5cb2c4f0dc9"} Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.906487 4565 scope.go:117] "RemoveContainer" containerID="9ad324ad7207631b4c3ace3dfa8012001ad2d404f7545a3ef392b5cb2c4f0dc9" Nov 25 09:33:22 crc kubenswrapper[4565]: E1125 09:33:22.907202 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager 
pod=keystone-operator-controller-manager-748dc6576f-pcqxq_openstack-operators(d5be161b-0f0c-485e-b1c7-50a9fff4b053)\"" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" podUID="d5be161b-0f0c-485e-b1c7-50a9fff4b053" Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.911760 4565 generic.go:334] "Generic (PLEG): container finished" podID="31dbf471-6fab-4ddd-a384-e4dd5335d5dc" containerID="f4d9d06ae72225fe26a297e05d2013bf19d707ca1336dba5413d1eba46865385" exitCode=1 Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.911873 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" event={"ID":"31dbf471-6fab-4ddd-a384-e4dd5335d5dc","Type":"ContainerDied","Data":"f4d9d06ae72225fe26a297e05d2013bf19d707ca1336dba5413d1eba46865385"} Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.912818 4565 scope.go:117] "RemoveContainer" containerID="f4d9d06ae72225fe26a297e05d2013bf19d707ca1336dba5413d1eba46865385" Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.918906 4565 generic.go:334] "Generic (PLEG): container finished" podID="a933a688-5393-4b7b-b0b7-6ee5791970b1" containerID="41a861697ee0014a152c9966148b09cc39165d3506ca0b9e482fccdfc89b459b" exitCode=1 Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.918970 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" event={"ID":"a933a688-5393-4b7b-b0b7-6ee5791970b1","Type":"ContainerDied","Data":"41a861697ee0014a152c9966148b09cc39165d3506ca0b9e482fccdfc89b459b"} Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.919482 4565 scope.go:117] "RemoveContainer" containerID="41a861697ee0014a152c9966148b09cc39165d3506ca0b9e482fccdfc89b459b" Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.919772 4565 scope.go:117] "RemoveContainer" containerID="cafda87c4a914bf7e4e5b3b61c210729540c87f72d5956d1d530ff48cab41135" Nov 25 
09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.925291 4565 generic.go:334] "Generic (PLEG): container finished" podID="4ee66804-213d-4e52-b04b-6b00eec8de2d" containerID="9d9e03500c8cfd815f40511263131618cb2a25d961e13106b01b9f44e9a7722a" exitCode=1 Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.925388 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" event={"ID":"4ee66804-213d-4e52-b04b-6b00eec8de2d","Type":"ContainerDied","Data":"9d9e03500c8cfd815f40511263131618cb2a25d961e13106b01b9f44e9a7722a"} Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.925818 4565 scope.go:117] "RemoveContainer" containerID="9d9e03500c8cfd815f40511263131618cb2a25d961e13106b01b9f44e9a7722a" Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.932193 4565 generic.go:334] "Generic (PLEG): container finished" podID="333ae034-2972-4915-a547-364c01510827" containerID="7f09210820aabf170f0ed34c6d282ddf5e8dc64da36a077852721535b935977b" exitCode=1 Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.932265 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" event={"ID":"333ae034-2972-4915-a547-364c01510827","Type":"ContainerDied","Data":"7f09210820aabf170f0ed34c6d282ddf5e8dc64da36a077852721535b935977b"} Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.932670 4565 scope.go:117] "RemoveContainer" containerID="7f09210820aabf170f0ed34c6d282ddf5e8dc64da36a077852721535b935977b" Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.937448 4565 generic.go:334] "Generic (PLEG): container finished" podID="a65931e1-7a1f-4251-9c4f-996b407dfb03" containerID="4343e2d1f5e01268a651827a151d27955c9600d9fd68b7ccbdda7e4c2374e3ac" exitCode=1 Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.937504 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4llp" event={"ID":"a65931e1-7a1f-4251-9c4f-996b407dfb03","Type":"ContainerDied","Data":"4343e2d1f5e01268a651827a151d27955c9600d9fd68b7ccbdda7e4c2374e3ac"} Nov 25 09:33:22 crc kubenswrapper[4565]: I1125 09:33:22.938053 4565 scope.go:117] "RemoveContainer" containerID="4343e2d1f5e01268a651827a151d27955c9600d9fd68b7ccbdda7e4c2374e3ac" Nov 25 09:33:23 crc kubenswrapper[4565]: I1125 09:33:23.042998 4565 scope.go:117] "RemoveContainer" containerID="1e376211332e007ac9c6b0e3d3f6d9972a045be74d6395f12a7b6c09deac8e26" Nov 25 09:33:23 crc kubenswrapper[4565]: I1125 09:33:23.165128 4565 scope.go:117] "RemoveContainer" containerID="a12528fcdeb1de8e7dfa385a1d981fd380ed4e4bf30f3a9acc1ccde7d2b796b8" Nov 25 09:33:23 crc kubenswrapper[4565]: I1125 09:33:23.951132 4565 generic.go:334] "Generic (PLEG): container finished" podID="5275621d-5c51-4586-85f2-e0e24cb32266" containerID="917974a4a670e3c73c3ea6efd2300d879b57787c0c729123a5d7a66c69917b4c" exitCode=1 Nov 25 09:33:23 crc kubenswrapper[4565]: I1125 09:33:23.951793 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5275621d-5c51-4586-85f2-e0e24cb32266","Type":"ContainerDied","Data":"917974a4a670e3c73c3ea6efd2300d879b57787c0c729123a5d7a66c69917b4c"} Nov 25 09:33:23 crc kubenswrapper[4565]: I1125 09:33:23.951857 4565 scope.go:117] "RemoveContainer" containerID="b5592a5a7016586fd12310dd71ccecdf223089a0e072c008af0bdc1e65ad3008" Nov 25 09:33:23 crc kubenswrapper[4565]: I1125 09:33:23.952512 4565 scope.go:117] "RemoveContainer" containerID="917974a4a670e3c73c3ea6efd2300d879b57787c0c729123a5d7a66c69917b4c" Nov 25 09:33:23 crc kubenswrapper[4565]: E1125 09:33:23.953004 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-state-metrics 
pod=kube-state-metrics-0_openstack(5275621d-5c51-4586-85f2-e0e24cb32266)\"" pod="openstack/kube-state-metrics-0" podUID="5275621d-5c51-4586-85f2-e0e24cb32266" Nov 25 09:33:23 crc kubenswrapper[4565]: I1125 09:33:23.957025 4565 generic.go:334] "Generic (PLEG): container finished" podID="a65931e1-7a1f-4251-9c4f-996b407dfb03" containerID="3b7f1e6786d926dfd011db64eaf83fb1f02bcfd5a1572e1f566eb1f2f2b2dc51" exitCode=1 Nov 25 09:33:23 crc kubenswrapper[4565]: I1125 09:33:23.957094 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4llp" event={"ID":"a65931e1-7a1f-4251-9c4f-996b407dfb03","Type":"ContainerDied","Data":"3b7f1e6786d926dfd011db64eaf83fb1f02bcfd5a1572e1f566eb1f2f2b2dc51"} Nov 25 09:33:23 crc kubenswrapper[4565]: I1125 09:33:23.957676 4565 scope.go:117] "RemoveContainer" containerID="3b7f1e6786d926dfd011db64eaf83fb1f02bcfd5a1572e1f566eb1f2f2b2dc51" Nov 25 09:33:23 crc kubenswrapper[4565]: E1125 09:33:23.958161 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=operator pod=rabbitmq-cluster-operator-manager-668c99d594-s4llp_openstack-operators(a65931e1-7a1f-4251-9c4f-996b407dfb03)\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4llp" podUID="a65931e1-7a1f-4251-9c4f-996b407dfb03" Nov 25 09:33:23 crc kubenswrapper[4565]: I1125 09:33:23.972512 4565 generic.go:334] "Generic (PLEG): container finished" podID="4ee66804-213d-4e52-b04b-6b00eec8de2d" containerID="53c43e5c9a264f3924b9a24e8612c83cf005f784b2024e5ac0fb03b06976dffe" exitCode=1 Nov 25 09:33:23 crc kubenswrapper[4565]: I1125 09:33:23.972673 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" 
event={"ID":"4ee66804-213d-4e52-b04b-6b00eec8de2d","Type":"ContainerDied","Data":"53c43e5c9a264f3924b9a24e8612c83cf005f784b2024e5ac0fb03b06976dffe"} Nov 25 09:33:23 crc kubenswrapper[4565]: I1125 09:33:23.973902 4565 scope.go:117] "RemoveContainer" containerID="53c43e5c9a264f3924b9a24e8612c83cf005f784b2024e5ac0fb03b06976dffe" Nov 25 09:33:23 crc kubenswrapper[4565]: E1125 09:33:23.974262 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=mariadb-operator-controller-manager-cb6c4fdb7-2gkww_openstack-operators(4ee66804-213d-4e52-b04b-6b00eec8de2d)\"" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" podUID="4ee66804-213d-4e52-b04b-6b00eec8de2d" Nov 25 09:33:23 crc kubenswrapper[4565]: I1125 09:33:23.979455 4565 generic.go:334] "Generic (PLEG): container finished" podID="333ae034-2972-4915-a547-364c01510827" containerID="6f09f6f09b8e3f7229a752e45171713583202cfe7a6b50597a068b0deee5b8cb" exitCode=1 Nov 25 09:33:23 crc kubenswrapper[4565]: I1125 09:33:23.979600 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" event={"ID":"333ae034-2972-4915-a547-364c01510827","Type":"ContainerDied","Data":"6f09f6f09b8e3f7229a752e45171713583202cfe7a6b50597a068b0deee5b8cb"} Nov 25 09:33:23 crc kubenswrapper[4565]: I1125 09:33:23.980129 4565 scope.go:117] "RemoveContainer" containerID="6f09f6f09b8e3f7229a752e45171713583202cfe7a6b50597a068b0deee5b8cb" Nov 25 09:33:23 crc kubenswrapper[4565]: E1125 09:33:23.980452 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=infra-operator-controller-manager-d5cc86f4b-2q9rf_openstack-operators(333ae034-2972-4915-a547-364c01510827)\"" 
pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" podUID="333ae034-2972-4915-a547-364c01510827" Nov 25 09:33:23 crc kubenswrapper[4565]: I1125 09:33:23.990203 4565 generic.go:334] "Generic (PLEG): container finished" podID="31dbf471-6fab-4ddd-a384-e4dd5335d5dc" containerID="c7fdc354d4332496f54d6bac287a77e5366ec2b3bdffb36d0201e129de32241c" exitCode=1 Nov 25 09:33:23 crc kubenswrapper[4565]: I1125 09:33:23.990324 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" event={"ID":"31dbf471-6fab-4ddd-a384-e4dd5335d5dc","Type":"ContainerDied","Data":"c7fdc354d4332496f54d6bac287a77e5366ec2b3bdffb36d0201e129de32241c"} Nov 25 09:33:23 crc kubenswrapper[4565]: I1125 09:33:23.990675 4565 scope.go:117] "RemoveContainer" containerID="c7fdc354d4332496f54d6bac287a77e5366ec2b3bdffb36d0201e129de32241c" Nov 25 09:33:23 crc kubenswrapper[4565]: E1125 09:33:23.990900 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=placement-operator-controller-manager-5db546f9d9-kgn59_openstack-operators(31dbf471-6fab-4ddd-a384-e4dd5335d5dc)\"" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" podUID="31dbf471-6fab-4ddd-a384-e4dd5335d5dc" Nov 25 09:33:23 crc kubenswrapper[4565]: I1125 09:33:23.994587 4565 generic.go:334] "Generic (PLEG): container finished" podID="a933a688-5393-4b7b-b0b7-6ee5791970b1" containerID="ae884bbc6cb6f17604592c43ea4a5a4e3e0cc2967174a7f3bf1aece6bbdcd1b9" exitCode=1 Nov 25 09:33:23 crc kubenswrapper[4565]: I1125 09:33:23.994655 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" event={"ID":"a933a688-5393-4b7b-b0b7-6ee5791970b1","Type":"ContainerDied","Data":"ae884bbc6cb6f17604592c43ea4a5a4e3e0cc2967174a7f3bf1aece6bbdcd1b9"} 
Nov 25 09:33:23 crc kubenswrapper[4565]: I1125 09:33:23.995224 4565 scope.go:117] "RemoveContainer" containerID="ae884bbc6cb6f17604592c43ea4a5a4e3e0cc2967174a7f3bf1aece6bbdcd1b9" Nov 25 09:33:23 crc kubenswrapper[4565]: E1125 09:33:23.995633 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=designate-operator-controller-manager-7d695c9b56-t68ww_openstack-operators(a933a688-5393-4b7b-b0b7-6ee5791970b1)\"" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" podUID="a933a688-5393-4b7b-b0b7-6ee5791970b1" Nov 25 09:33:24 crc kubenswrapper[4565]: I1125 09:33:24.011493 4565 scope.go:117] "RemoveContainer" containerID="4343e2d1f5e01268a651827a151d27955c9600d9fd68b7ccbdda7e4c2374e3ac" Nov 25 09:33:24 crc kubenswrapper[4565]: I1125 09:33:24.061152 4565 scope.go:117] "RemoveContainer" containerID="9d9e03500c8cfd815f40511263131618cb2a25d961e13106b01b9f44e9a7722a" Nov 25 09:33:24 crc kubenswrapper[4565]: I1125 09:33:24.109652 4565 scope.go:117] "RemoveContainer" containerID="7f09210820aabf170f0ed34c6d282ddf5e8dc64da36a077852721535b935977b" Nov 25 09:33:24 crc kubenswrapper[4565]: I1125 09:33:24.161856 4565 scope.go:117] "RemoveContainer" containerID="f4d9d06ae72225fe26a297e05d2013bf19d707ca1336dba5413d1eba46865385" Nov 25 09:33:24 crc kubenswrapper[4565]: I1125 09:33:24.191732 4565 scope.go:117] "RemoveContainer" containerID="41a861697ee0014a152c9966148b09cc39165d3506ca0b9e482fccdfc89b459b" Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.011475 4565 generic.go:334] "Generic (PLEG): container finished" podID="1ef630cb-2220-41f5-8a3d-66a2a78ce0ce" containerID="c258ff8ac7399fce58283640d51f84b4bd8f8831fa5a6a1f461ab97cbe543478" exitCode=1 Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.011535 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" event={"ID":"1ef630cb-2220-41f5-8a3d-66a2a78ce0ce","Type":"ContainerDied","Data":"c258ff8ac7399fce58283640d51f84b4bd8f8831fa5a6a1f461ab97cbe543478"} Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.012552 4565 scope.go:117] "RemoveContainer" containerID="c258ff8ac7399fce58283640d51f84b4bd8f8831fa5a6a1f461ab97cbe543478" Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.014821 4565 scope.go:117] "RemoveContainer" containerID="917974a4a670e3c73c3ea6efd2300d879b57787c0c729123a5d7a66c69917b4c" Nov 25 09:33:25 crc kubenswrapper[4565]: E1125 09:33:25.015167 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-state-metrics pod=kube-state-metrics-0_openstack(5275621d-5c51-4586-85f2-e0e24cb32266)\"" pod="openstack/kube-state-metrics-0" podUID="5275621d-5c51-4586-85f2-e0e24cb32266" Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.017980 4565 generic.go:334] "Generic (PLEG): container finished" podID="354fe5db-35d0-4d94-989c-02a077f8bd20" containerID="f2c4442747e0bb833c0acac848a2bc57dddd3b678a164abcae51fb4ac20c914b" exitCode=1 Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.018074 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" event={"ID":"354fe5db-35d0-4d94-989c-02a077f8bd20","Type":"ContainerDied","Data":"f2c4442747e0bb833c0acac848a2bc57dddd3b678a164abcae51fb4ac20c914b"} Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.018814 4565 scope.go:117] "RemoveContainer" containerID="f2c4442747e0bb833c0acac848a2bc57dddd3b678a164abcae51fb4ac20c914b" Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.023389 4565 generic.go:334] "Generic (PLEG): container finished" podID="92be75e0-b60b-4f41-bde1-4f74a4d306e3" 
containerID="1bbe5a65350874311ae08c162c53a476b40076f44e2ffa9eb7261763920bee54" exitCode=1 Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.023459 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" event={"ID":"92be75e0-b60b-4f41-bde1-4f74a4d306e3","Type":"ContainerDied","Data":"1bbe5a65350874311ae08c162c53a476b40076f44e2ffa9eb7261763920bee54"} Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.024079 4565 scope.go:117] "RemoveContainer" containerID="1bbe5a65350874311ae08c162c53a476b40076f44e2ffa9eb7261763920bee54" Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.027103 4565 generic.go:334] "Generic (PLEG): container finished" podID="873884b1-6ee8-400c-9ca2-0b0b3c4618e9" containerID="5b628e980940a13fbfb319b158c9a6d712f3202fc04bb3a1f2f22cc6b3b1c08d" exitCode=1 Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.027166 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" event={"ID":"873884b1-6ee8-400c-9ca2-0b0b3c4618e9","Type":"ContainerDied","Data":"5b628e980940a13fbfb319b158c9a6d712f3202fc04bb3a1f2f22cc6b3b1c08d"} Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.028341 4565 scope.go:117] "RemoveContainer" containerID="5b628e980940a13fbfb319b158c9a6d712f3202fc04bb3a1f2f22cc6b3b1c08d" Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.030886 4565 generic.go:334] "Generic (PLEG): container finished" podID="cf68120a-e894-4189-8035-91f8045618c0" containerID="5304514dba13319a2ba18742f4624c0238b46fe0a13ee331aca3412ae4390ac4" exitCode=1 Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.030961 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" event={"ID":"cf68120a-e894-4189-8035-91f8045618c0","Type":"ContainerDied","Data":"5304514dba13319a2ba18742f4624c0238b46fe0a13ee331aca3412ae4390ac4"} 
Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.034009 4565 scope.go:117] "RemoveContainer" containerID="5304514dba13319a2ba18742f4624c0238b46fe0a13ee331aca3412ae4390ac4" Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.034645 4565 generic.go:334] "Generic (PLEG): container finished" podID="1af57713-55c3-45ec-b98b-1aac75a2d60b" containerID="eddbde34bce7f860b148d6cf4d6799a9409966d496b8aef9457d602b38d0841b" exitCode=1 Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.034716 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" event={"ID":"1af57713-55c3-45ec-b98b-1aac75a2d60b","Type":"ContainerDied","Data":"eddbde34bce7f860b148d6cf4d6799a9409966d496b8aef9457d602b38d0841b"} Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.035105 4565 scope.go:117] "RemoveContainer" containerID="eddbde34bce7f860b148d6cf4d6799a9409966d496b8aef9457d602b38d0841b" Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.041110 4565 generic.go:334] "Generic (PLEG): container finished" podID="93da1f7e-c5e8-4c9c-b6af-feb85c526b47" containerID="3299c62482ca3385dd947f2c0b11108d7248f063e8aba8c0a6c08fd8f2e2a9fd" exitCode=1 Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.041180 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" event={"ID":"93da1f7e-c5e8-4c9c-b6af-feb85c526b47","Type":"ContainerDied","Data":"3299c62482ca3385dd947f2c0b11108d7248f063e8aba8c0a6c08fd8f2e2a9fd"} Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.041772 4565 scope.go:117] "RemoveContainer" containerID="3299c62482ca3385dd947f2c0b11108d7248f063e8aba8c0a6c08fd8f2e2a9fd" Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.052404 4565 generic.go:334] "Generic (PLEG): container finished" podID="579400cf-d71f-47f4-a98e-b94ccbf4ff72" containerID="48ad5addee24793315efdcdd8e9716c707249289752040834ebe787d71abeced" exitCode=1 Nov 25 
09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.052492 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" event={"ID":"579400cf-d71f-47f4-a98e-b94ccbf4ff72","Type":"ContainerDied","Data":"48ad5addee24793315efdcdd8e9716c707249289752040834ebe787d71abeced"} Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.053129 4565 scope.go:117] "RemoveContainer" containerID="48ad5addee24793315efdcdd8e9716c707249289752040834ebe787d71abeced" Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.066323 4565 generic.go:334] "Generic (PLEG): container finished" podID="6402fac4-067f-4410-a00c-0d438d502f3c" containerID="39416707e848ac167d22107d737be21ebd583d251bb7d1c35a8224268cf783a3" exitCode=1 Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.066408 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" event={"ID":"6402fac4-067f-4410-a00c-0d438d502f3c","Type":"ContainerDied","Data":"39416707e848ac167d22107d737be21ebd583d251bb7d1c35a8224268cf783a3"} Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.067492 4565 scope.go:117] "RemoveContainer" containerID="39416707e848ac167d22107d737be21ebd583d251bb7d1c35a8224268cf783a3" Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.068851 4565 generic.go:334] "Generic (PLEG): container finished" podID="d0ef0237-045a-4153-a377-07b2c9e6ceba" containerID="96ab7e0d1dab36db235dd8e5223a419423367ba478db8f4b4b224391ea1e08b4" exitCode=1 Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.069039 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" event={"ID":"d0ef0237-045a-4153-a377-07b2c9e6ceba","Type":"ContainerDied","Data":"96ab7e0d1dab36db235dd8e5223a419423367ba478db8f4b4b224391ea1e08b4"} Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.069921 4565 scope.go:117] 
"RemoveContainer" containerID="96ab7e0d1dab36db235dd8e5223a419423367ba478db8f4b4b224391ea1e08b4" Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.071073 4565 generic.go:334] "Generic (PLEG): container finished" podID="f2c67417-c283-4158-91ec-f49478a5378e" containerID="263b89793cc5d3a3a3cb480f19bc73a24f38dc269b9fcb0f53c578e6fa2c4509" exitCode=1 Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.071106 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" event={"ID":"f2c67417-c283-4158-91ec-f49478a5378e","Type":"ContainerDied","Data":"263b89793cc5d3a3a3cb480f19bc73a24f38dc269b9fcb0f53c578e6fa2c4509"} Nov 25 09:33:25 crc kubenswrapper[4565]: I1125 09:33:25.071410 4565 scope.go:117] "RemoveContainer" containerID="263b89793cc5d3a3a3cb480f19bc73a24f38dc269b9fcb0f53c578e6fa2c4509" Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.092834 4565 generic.go:334] "Generic (PLEG): container finished" podID="873884b1-6ee8-400c-9ca2-0b0b3c4618e9" containerID="c6877b4d4be1ca791052e41504ccb21d544739bdcad6fd6483be3682df29e3f4" exitCode=1 Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.092945 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" event={"ID":"873884b1-6ee8-400c-9ca2-0b0b3c4618e9","Type":"ContainerDied","Data":"c6877b4d4be1ca791052e41504ccb21d544739bdcad6fd6483be3682df29e3f4"} Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.093363 4565 scope.go:117] "RemoveContainer" containerID="5b628e980940a13fbfb319b158c9a6d712f3202fc04bb3a1f2f22cc6b3b1c08d" Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.094678 4565 scope.go:117] "RemoveContainer" containerID="c6877b4d4be1ca791052e41504ccb21d544739bdcad6fd6483be3682df29e3f4" Nov 25 09:33:26 crc kubenswrapper[4565]: E1125 09:33:26.095171 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=barbican-operator-controller-manager-86dc4d89c8-cxwrc_openstack-operators(873884b1-6ee8-400c-9ca2-0b0b3c4618e9)\"" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" podUID="873884b1-6ee8-400c-9ca2-0b0b3c4618e9" Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.103909 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.107744 4565 generic.go:334] "Generic (PLEG): container finished" podID="f2c67417-c283-4158-91ec-f49478a5378e" containerID="6cdcbd3a99dad41a9490820cbd4caba91b874c3084db74620830cfb37f590db5" exitCode=1 Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.107825 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" event={"ID":"f2c67417-c283-4158-91ec-f49478a5378e","Type":"ContainerDied","Data":"6cdcbd3a99dad41a9490820cbd4caba91b874c3084db74620830cfb37f590db5"} Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.108310 4565 scope.go:117] "RemoveContainer" containerID="6cdcbd3a99dad41a9490820cbd4caba91b874c3084db74620830cfb37f590db5" Nov 25 09:33:26 crc kubenswrapper[4565]: E1125 09:33:26.108576 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=nova-operator-controller-manager-79556f57fc-n9bdd_openstack-operators(f2c67417-c283-4158-91ec-f49478a5378e)\"" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" podUID="f2c67417-c283-4158-91ec-f49478a5378e" Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.110520 4565 generic.go:334] "Generic (PLEG): container finished" podID="93da1f7e-c5e8-4c9c-b6af-feb85c526b47" 
containerID="b72b884caf1f74bd9d07b5f310d0199b25597337fb741a6dba0515a2d0f4ce2a" exitCode=1 Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.110569 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" event={"ID":"93da1f7e-c5e8-4c9c-b6af-feb85c526b47","Type":"ContainerDied","Data":"b72b884caf1f74bd9d07b5f310d0199b25597337fb741a6dba0515a2d0f4ce2a"} Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.110888 4565 scope.go:117] "RemoveContainer" containerID="b72b884caf1f74bd9d07b5f310d0199b25597337fb741a6dba0515a2d0f4ce2a" Nov 25 09:33:26 crc kubenswrapper[4565]: E1125 09:33:26.111247 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=heat-operator-controller-manager-774b86978c-bd8d6_openstack-operators(93da1f7e-c5e8-4c9c-b6af-feb85c526b47)\"" pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" podUID="93da1f7e-c5e8-4c9c-b6af-feb85c526b47" Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.114596 4565 generic.go:334] "Generic (PLEG): container finished" podID="579400cf-d71f-47f4-a98e-b94ccbf4ff72" containerID="27dc1ea0afa359198b298e28f68b50312f5e0b24d681eb6e799596611d4da4dc" exitCode=1 Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.114697 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" event={"ID":"579400cf-d71f-47f4-a98e-b94ccbf4ff72","Type":"ContainerDied","Data":"27dc1ea0afa359198b298e28f68b50312f5e0b24d681eb6e799596611d4da4dc"} Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.115630 4565 scope.go:117] "RemoveContainer" containerID="27dc1ea0afa359198b298e28f68b50312f5e0b24d681eb6e799596611d4da4dc" Nov 25 09:33:26 crc kubenswrapper[4565]: E1125 09:33:26.116047 4565 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=openstack-operator-controller-manager-7cd5954d9-fkc7l_openstack-operators(579400cf-d71f-47f4-a98e-b94ccbf4ff72)\"" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" podUID="579400cf-d71f-47f4-a98e-b94ccbf4ff72" Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.123710 4565 generic.go:334] "Generic (PLEG): container finished" podID="354fe5db-35d0-4d94-989c-02a077f8bd20" containerID="6f866ba4259bb4c97427e615758be3fe9be7d64fc33e85de34f9931a2e9f42e0" exitCode=1 Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.123850 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" event={"ID":"354fe5db-35d0-4d94-989c-02a077f8bd20","Type":"ContainerDied","Data":"6f866ba4259bb4c97427e615758be3fe9be7d64fc33e85de34f9931a2e9f42e0"} Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.124507 4565 scope.go:117] "RemoveContainer" containerID="6f866ba4259bb4c97427e615758be3fe9be7d64fc33e85de34f9931a2e9f42e0" Nov 25 09:33:26 crc kubenswrapper[4565]: E1125 09:33:26.125067 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=horizon-operator-controller-manager-68c9694994-2s9lf_openstack-operators(354fe5db-35d0-4d94-989c-02a077f8bd20)\"" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" podUID="354fe5db-35d0-4d94-989c-02a077f8bd20" Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.127226 4565 generic.go:334] "Generic (PLEG): container finished" podID="92be75e0-b60b-4f41-bde1-4f74a4d306e3" containerID="aafb0d9aabc1c967ac2bae0d9486472dbe063252bab7ac1a2974b7a1c2fe55ae" exitCode=1 Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.127273 4565 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" event={"ID":"92be75e0-b60b-4f41-bde1-4f74a4d306e3","Type":"ContainerDied","Data":"aafb0d9aabc1c967ac2bae0d9486472dbe063252bab7ac1a2974b7a1c2fe55ae"} Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.128195 4565 scope.go:117] "RemoveContainer" containerID="aafb0d9aabc1c967ac2bae0d9486472dbe063252bab7ac1a2974b7a1c2fe55ae" Nov 25 09:33:26 crc kubenswrapper[4565]: E1125 09:33:26.128500 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=glance-operator-controller-manager-68b95954c9-f9bbj_openstack-operators(92be75e0-b60b-4f41-bde1-4f74a4d306e3)\"" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" podUID="92be75e0-b60b-4f41-bde1-4f74a4d306e3" Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.133585 4565 generic.go:334] "Generic (PLEG): container finished" podID="cf68120a-e894-4189-8035-91f8045618c0" containerID="58ec3e507272ad66937820573df826cb681a2838d59b3fcf35ea899e99ac1d26" exitCode=1 Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.133700 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" event={"ID":"cf68120a-e894-4189-8035-91f8045618c0","Type":"ContainerDied","Data":"58ec3e507272ad66937820573df826cb681a2838d59b3fcf35ea899e99ac1d26"} Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.134127 4565 scope.go:117] "RemoveContainer" containerID="58ec3e507272ad66937820573df826cb681a2838d59b3fcf35ea899e99ac1d26" Nov 25 09:33:26 crc kubenswrapper[4565]: E1125 09:33:26.134591 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager 
pod=manila-operator-controller-manager-58bb8d67cc-lz6zt_openstack-operators(cf68120a-e894-4189-8035-91f8045618c0)\"" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" podUID="cf68120a-e894-4189-8035-91f8045618c0" Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.137249 4565 generic.go:334] "Generic (PLEG): container finished" podID="1af57713-55c3-45ec-b98b-1aac75a2d60b" containerID="525ec3b9b678e82ea9c648ffcb9ff3582f8f5209a588bcf379f896c66f66fe14" exitCode=1 Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.137322 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" event={"ID":"1af57713-55c3-45ec-b98b-1aac75a2d60b","Type":"ContainerDied","Data":"525ec3b9b678e82ea9c648ffcb9ff3582f8f5209a588bcf379f896c66f66fe14"} Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.138133 4565 scope.go:117] "RemoveContainer" containerID="525ec3b9b678e82ea9c648ffcb9ff3582f8f5209a588bcf379f896c66f66fe14" Nov 25 09:33:26 crc kubenswrapper[4565]: E1125 09:33:26.138450 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=cinder-operator-controller-manager-79856dc55c-ddlth_openstack-operators(1af57713-55c3-45ec-b98b-1aac75a2d60b)\"" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" podUID="1af57713-55c3-45ec-b98b-1aac75a2d60b" Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.140939 4565 generic.go:334] "Generic (PLEG): container finished" podID="6402fac4-067f-4410-a00c-0d438d502f3c" containerID="df5c25075caf8dd5cd697e3a0f68e7c41b80f47ff63499be753dc30d43daa6b3" exitCode=1 Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.140992 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" 
event={"ID":"6402fac4-067f-4410-a00c-0d438d502f3c","Type":"ContainerDied","Data":"df5c25075caf8dd5cd697e3a0f68e7c41b80f47ff63499be753dc30d43daa6b3"} Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.141308 4565 scope.go:117] "RemoveContainer" containerID="df5c25075caf8dd5cd697e3a0f68e7c41b80f47ff63499be753dc30d43daa6b3" Nov 25 09:33:26 crc kubenswrapper[4565]: E1125 09:33:26.141526 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=ironic-operator-controller-manager-5bfcdc958c-mjsqx_openstack-operators(6402fac4-067f-4410-a00c-0d438d502f3c)\"" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" podUID="6402fac4-067f-4410-a00c-0d438d502f3c" Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.143881 4565 generic.go:334] "Generic (PLEG): container finished" podID="1ef630cb-2220-41f5-8a3d-66a2a78ce0ce" containerID="018c92578f58e2078e4d374e10435340fcbf886cd048eea55abfd31cf9ca2e6b" exitCode=1 Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.143950 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" event={"ID":"1ef630cb-2220-41f5-8a3d-66a2a78ce0ce","Type":"ContainerDied","Data":"018c92578f58e2078e4d374e10435340fcbf886cd048eea55abfd31cf9ca2e6b"} Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.144249 4565 scope.go:117] "RemoveContainer" containerID="018c92578f58e2078e4d374e10435340fcbf886cd048eea55abfd31cf9ca2e6b" Nov 25 09:33:26 crc kubenswrapper[4565]: E1125 09:33:26.144455 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=telemetry-operator-controller-manager-567f98c9d-7dzx4_openstack-operators(1ef630cb-2220-41f5-8a3d-66a2a78ce0ce)\"" 
pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" podUID="1ef630cb-2220-41f5-8a3d-66a2a78ce0ce" Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.152278 4565 generic.go:334] "Generic (PLEG): container finished" podID="d0ef0237-045a-4153-a377-07b2c9e6ceba" containerID="8eb6276b64d3d07c26e1e60c5d3c7c9a38cc98ea142fa0ef2f183d791476a510" exitCode=1 Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.152333 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" event={"ID":"d0ef0237-045a-4153-a377-07b2c9e6ceba","Type":"ContainerDied","Data":"8eb6276b64d3d07c26e1e60c5d3c7c9a38cc98ea142fa0ef2f183d791476a510"} Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.153890 4565 scope.go:117] "RemoveContainer" containerID="8eb6276b64d3d07c26e1e60c5d3c7c9a38cc98ea142fa0ef2f183d791476a510" Nov 25 09:33:26 crc kubenswrapper[4565]: E1125 09:33:26.154361 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=neutron-operator-controller-manager-7c57c8bbc4-pzd74_openstack-operators(d0ef0237-045a-4153-a377-07b2c9e6ceba)\"" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" podUID="d0ef0237-045a-4153-a377-07b2c9e6ceba" Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.170123 4565 scope.go:117] "RemoveContainer" containerID="263b89793cc5d3a3a3cb480f19bc73a24f38dc269b9fcb0f53c578e6fa2c4509" Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.228477 4565 scope.go:117] "RemoveContainer" containerID="3299c62482ca3385dd947f2c0b11108d7248f063e8aba8c0a6c08fd8f2e2a9fd" Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.282886 4565 scope.go:117] "RemoveContainer" containerID="48ad5addee24793315efdcdd8e9716c707249289752040834ebe787d71abeced" Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.333002 4565 
scope.go:117] "RemoveContainer" containerID="f2c4442747e0bb833c0acac848a2bc57dddd3b678a164abcae51fb4ac20c914b" Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.357675 4565 scope.go:117] "RemoveContainer" containerID="1bbe5a65350874311ae08c162c53a476b40076f44e2ffa9eb7261763920bee54" Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.379126 4565 scope.go:117] "RemoveContainer" containerID="5304514dba13319a2ba18742f4624c0238b46fe0a13ee331aca3412ae4390ac4" Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.397592 4565 scope.go:117] "RemoveContainer" containerID="eddbde34bce7f860b148d6cf4d6799a9409966d496b8aef9457d602b38d0841b" Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.425941 4565 scope.go:117] "RemoveContainer" containerID="39416707e848ac167d22107d737be21ebd583d251bb7d1c35a8224268cf783a3" Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.448837 4565 scope.go:117] "RemoveContainer" containerID="c258ff8ac7399fce58283640d51f84b4bd8f8831fa5a6a1f461ab97cbe543478" Nov 25 09:33:26 crc kubenswrapper[4565]: I1125 09:33:26.469812 4565 scope.go:117] "RemoveContainer" containerID="96ab7e0d1dab36db235dd8e5223a419423367ba478db8f4b4b224391ea1e08b4" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.164846 4565 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4b374521-d202-4b62-b3ae-ee994800c958" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.429467 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.430056 4565 scope.go:117] "RemoveContainer" containerID="c6877b4d4be1ca791052e41504ccb21d544739bdcad6fd6483be3682df29e3f4" Nov 25 09:33:27 crc kubenswrapper[4565]: E1125 09:33:27.430355 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=barbican-operator-controller-manager-86dc4d89c8-cxwrc_openstack-operators(873884b1-6ee8-400c-9ca2-0b0b3c4618e9)\"" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" podUID="873884b1-6ee8-400c-9ca2-0b0b3c4618e9" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.478903 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.479302 4565 scope.go:117] "RemoveContainer" containerID="525ec3b9b678e82ea9c648ffcb9ff3582f8f5209a588bcf379f896c66f66fe14" Nov 25 09:33:27 crc kubenswrapper[4565]: E1125 09:33:27.479514 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=cinder-operator-controller-manager-79856dc55c-ddlth_openstack-operators(1af57713-55c3-45ec-b98b-1aac75a2d60b)\"" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" podUID="1af57713-55c3-45ec-b98b-1aac75a2d60b" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.491122 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.491842 4565 scope.go:117] "RemoveContainer" containerID="b72b884caf1f74bd9d07b5f310d0199b25597337fb741a6dba0515a2d0f4ce2a" Nov 25 09:33:27 crc kubenswrapper[4565]: E1125 09:33:27.492272 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=heat-operator-controller-manager-774b86978c-bd8d6_openstack-operators(93da1f7e-c5e8-4c9c-b6af-feb85c526b47)\"" 
pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" podUID="93da1f7e-c5e8-4c9c-b6af-feb85c526b47" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.505568 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.506076 4565 scope.go:117] "RemoveContainer" containerID="aafb0d9aabc1c967ac2bae0d9486472dbe063252bab7ac1a2974b7a1c2fe55ae" Nov 25 09:33:27 crc kubenswrapper[4565]: E1125 09:33:27.506326 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=glance-operator-controller-manager-68b95954c9-f9bbj_openstack-operators(92be75e0-b60b-4f41-bde1-4f74a4d306e3)\"" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" podUID="92be75e0-b60b-4f41-bde1-4f74a4d306e3" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.527862 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.528937 4565 scope.go:117] "RemoveContainer" containerID="ae884bbc6cb6f17604592c43ea4a5a4e3e0cc2967174a7f3bf1aece6bbdcd1b9" Nov 25 09:33:27 crc kubenswrapper[4565]: E1125 09:33:27.529199 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=designate-operator-controller-manager-7d695c9b56-t68ww_openstack-operators(a933a688-5393-4b7b-b0b7-6ee5791970b1)\"" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" podUID="a933a688-5393-4b7b-b0b7-6ee5791970b1" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.544866 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.545295 4565 scope.go:117] "RemoveContainer" containerID="6f866ba4259bb4c97427e615758be3fe9be7d64fc33e85de34f9931a2e9f42e0" Nov 25 09:33:27 crc kubenswrapper[4565]: E1125 09:33:27.545528 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=horizon-operator-controller-manager-68c9694994-2s9lf_openstack-operators(354fe5db-35d0-4d94-989c-02a077f8bd20)\"" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" podUID="354fe5db-35d0-4d94-989c-02a077f8bd20" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.641603 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.642546 4565 scope.go:117] "RemoveContainer" containerID="df5c25075caf8dd5cd697e3a0f68e7c41b80f47ff63499be753dc30d43daa6b3" Nov 25 09:33:27 crc kubenswrapper[4565]: E1125 09:33:27.642854 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=ironic-operator-controller-manager-5bfcdc958c-mjsqx_openstack-operators(6402fac4-067f-4410-a00c-0d438d502f3c)\"" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" podUID="6402fac4-067f-4410-a00c-0d438d502f3c" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.841844 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.842538 4565 scope.go:117] "RemoveContainer" 
containerID="58ec3e507272ad66937820573df826cb681a2838d59b3fcf35ea899e99ac1d26" Nov 25 09:33:27 crc kubenswrapper[4565]: E1125 09:33:27.843121 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=manila-operator-controller-manager-58bb8d67cc-lz6zt_openstack-operators(cf68120a-e894-4189-8035-91f8045618c0)\"" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" podUID="cf68120a-e894-4189-8035-91f8045618c0" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.873853 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.874693 4565 scope.go:117] "RemoveContainer" containerID="9ad324ad7207631b4c3ace3dfa8012001ad2d404f7545a3ef392b5cb2c4f0dc9" Nov 25 09:33:27 crc kubenswrapper[4565]: E1125 09:33:27.874995 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=keystone-operator-controller-manager-748dc6576f-pcqxq_openstack-operators(d5be161b-0f0c-485e-b1c7-50a9fff4b053)\"" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" podUID="d5be161b-0f0c-485e-b1c7-50a9fff4b053" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.894865 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.895307 4565 scope.go:117] "RemoveContainer" containerID="53c43e5c9a264f3924b9a24e8612c83cf005f784b2024e5ac0fb03b06976dffe" Nov 25 09:33:27 crc kubenswrapper[4565]: E1125 09:33:27.895565 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" 
with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=mariadb-operator-controller-manager-cb6c4fdb7-2gkww_openstack-operators(4ee66804-213d-4e52-b04b-6b00eec8de2d)\"" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" podUID="4ee66804-213d-4e52-b04b-6b00eec8de2d" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.978041 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.978532 4565 scope.go:117] "RemoveContainer" containerID="8eb6276b64d3d07c26e1e60c5d3c7c9a38cc98ea142fa0ef2f183d791476a510" Nov 25 09:33:27 crc kubenswrapper[4565]: E1125 09:33:27.978785 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=neutron-operator-controller-manager-7c57c8bbc4-pzd74_openstack-operators(d0ef0237-045a-4153-a377-07b2c9e6ceba)\"" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" podUID="d0ef0237-045a-4153-a377-07b2c9e6ceba" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.997390 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" Nov 25 09:33:27 crc kubenswrapper[4565]: I1125 09:33:27.997750 4565 scope.go:117] "RemoveContainer" containerID="6cdcbd3a99dad41a9490820cbd4caba91b874c3084db74620830cfb37f590db5" Nov 25 09:33:27 crc kubenswrapper[4565]: E1125 09:33:27.997984 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=nova-operator-controller-manager-79556f57fc-n9bdd_openstack-operators(f2c67417-c283-4158-91ec-f49478a5378e)\"" 
pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" podUID="f2c67417-c283-4158-91ec-f49478a5378e" Nov 25 09:33:28 crc kubenswrapper[4565]: I1125 09:33:28.097629 4565 scope.go:117] "RemoveContainer" containerID="d2c6381e3a2fc6186927332b31949292bc6820cbd9e22f6bcc8c79991c41cc26" Nov 25 09:33:28 crc kubenswrapper[4565]: I1125 09:33:28.170745 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" Nov 25 09:33:28 crc kubenswrapper[4565]: I1125 09:33:28.171437 4565 scope.go:117] "RemoveContainer" containerID="fece14f8dd7ae8608f19ace2a9ae19b0a2f7ddf82794bc19a15fe2f63af6931b" Nov 25 09:33:28 crc kubenswrapper[4565]: E1125 09:33:28.172007 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=octavia-operator-controller-manager-fd75fd47d-hrr6t_openstack-operators(6279e5b8-cc23-4b43-9554-754a61174bcd)\"" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" podUID="6279e5b8-cc23-4b43-9554-754a61174bcd" Nov 25 09:33:28 crc kubenswrapper[4565]: I1125 09:33:28.208212 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" Nov 25 09:33:28 crc kubenswrapper[4565]: I1125 09:33:28.208995 4565 scope.go:117] "RemoveContainer" containerID="6f09f6f09b8e3f7229a752e45171713583202cfe7a6b50597a068b0deee5b8cb" Nov 25 09:33:28 crc kubenswrapper[4565]: E1125 09:33:28.209278 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=infra-operator-controller-manager-d5cc86f4b-2q9rf_openstack-operators(333ae034-2972-4915-a547-364c01510827)\"" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" 
podUID="333ae034-2972-4915-a547-364c01510827" Nov 25 09:33:28 crc kubenswrapper[4565]: I1125 09:33:28.218772 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" Nov 25 09:33:28 crc kubenswrapper[4565]: I1125 09:33:28.219271 4565 scope.go:117] "RemoveContainer" containerID="93384b9e296e68ef1b6e49f054b8a77404bc2ef513cceba546e0c57ea53bde50" Nov 25 09:33:28 crc kubenswrapper[4565]: E1125 09:33:28.219499 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=ovn-operator-controller-manager-66cf5c67ff-zz6wf_openstack-operators(052c7786-4d54-4af0-8598-91ff09cdf966)\"" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" podUID="052c7786-4d54-4af0-8598-91ff09cdf966" Nov 25 09:33:28 crc kubenswrapper[4565]: I1125 09:33:28.264856 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" Nov 25 09:33:28 crc kubenswrapper[4565]: I1125 09:33:28.266238 4565 scope.go:117] "RemoveContainer" containerID="c7fdc354d4332496f54d6bac287a77e5366ec2b3bdffb36d0201e129de32241c" Nov 25 09:33:28 crc kubenswrapper[4565]: E1125 09:33:28.266745 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=placement-operator-controller-manager-5db546f9d9-kgn59_openstack-operators(31dbf471-6fab-4ddd-a384-e4dd5335d5dc)\"" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" podUID="31dbf471-6fab-4ddd-a384-e4dd5335d5dc" Nov 25 09:33:28 crc kubenswrapper[4565]: I1125 09:33:28.308126 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" Nov 25 09:33:28 crc kubenswrapper[4565]: I1125 09:33:28.309149 4565 scope.go:117] "RemoveContainer" containerID="be1e26f0c8276ef465c6ecc95e1f2a46dc830a4eb6cf271efa889dba945f1a1c" Nov 25 09:33:28 crc kubenswrapper[4565]: E1125 09:33:28.309417 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=swift-operator-controller-manager-6fdc4fcf86-zl2jr_openstack-operators(f35f4446-328e-40d3-96d6-2bc814fb8a96)\"" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" podUID="f35f4446-328e-40d3-96d6-2bc814fb8a96" Nov 25 09:33:28 crc kubenswrapper[4565]: I1125 09:33:28.341351 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cb74df96-sj4j7" Nov 25 09:33:28 crc kubenswrapper[4565]: I1125 09:33:28.371256 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" Nov 25 09:33:28 crc kubenswrapper[4565]: I1125 09:33:28.372295 4565 scope.go:117] "RemoveContainer" containerID="018c92578f58e2078e4d374e10435340fcbf886cd048eea55abfd31cf9ca2e6b" Nov 25 09:33:28 crc kubenswrapper[4565]: E1125 09:33:28.372704 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=telemetry-operator-controller-manager-567f98c9d-7dzx4_openstack-operators(1ef630cb-2220-41f5-8a3d-66a2a78ce0ce)\"" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" podUID="1ef630cb-2220-41f5-8a3d-66a2a78ce0ce" Nov 25 09:33:28 crc kubenswrapper[4565]: I1125 09:33:28.388866 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" Nov 25 09:33:28 crc kubenswrapper[4565]: I1125 09:33:28.390057 4565 scope.go:117] "RemoveContainer" containerID="3808caf582fae45d0177a3cc8c982ff3deba632f9ec8ec60e01eb6832b54630a" Nov 25 09:33:28 crc kubenswrapper[4565]: E1125 09:33:28.390377 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=watcher-operator-controller-manager-864885998-v2c96_openstack-operators(3791b99a-d877-470f-8a8f-56f7b02be997)\"" pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" podUID="3791b99a-d877-470f-8a8f-56f7b02be997" Nov 25 09:33:29 crc kubenswrapper[4565]: I1125 09:33:29.216737 4565 generic.go:334] "Generic (PLEG): container finished" podID="145e5d59-fd78-4bc1-a97c-17ebf0d67fa4" containerID="a2d7d47dc55b713b12285c0b4f59ed400fdea491fccbf6452fa96db4f49cfb32" exitCode=1 Nov 25 09:33:29 crc kubenswrapper[4565]: I1125 09:33:29.216790 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" event={"ID":"145e5d59-fd78-4bc1-a97c-17ebf0d67fa4","Type":"ContainerDied","Data":"a2d7d47dc55b713b12285c0b4f59ed400fdea491fccbf6452fa96db4f49cfb32"} Nov 25 09:33:29 crc kubenswrapper[4565]: I1125 09:33:29.216837 4565 scope.go:117] "RemoveContainer" containerID="d2c6381e3a2fc6186927332b31949292bc6820cbd9e22f6bcc8c79991c41cc26" Nov 25 09:33:29 crc kubenswrapper[4565]: I1125 09:33:29.217260 4565 scope.go:117] "RemoveContainer" containerID="a2d7d47dc55b713b12285c0b4f59ed400fdea491fccbf6452fa96db4f49cfb32" Nov 25 09:33:29 crc kubenswrapper[4565]: E1125 09:33:29.217512 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager 
pod=metallb-operator-controller-manager-74454849f9-fjwfp_metallb-system(145e5d59-fd78-4bc1-a97c-17ebf0d67fa4)\"" pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" podUID="145e5d59-fd78-4bc1-a97c-17ebf0d67fa4" Nov 25 09:33:30 crc kubenswrapper[4565]: I1125 09:33:30.155844 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 25 09:33:30 crc kubenswrapper[4565]: I1125 09:33:30.325812 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-s8c4s" Nov 25 09:33:31 crc kubenswrapper[4565]: I1125 09:33:31.002019 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tx2k9" Nov 25 09:33:31 crc kubenswrapper[4565]: I1125 09:33:31.024785 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-zbng8" Nov 25 09:33:31 crc kubenswrapper[4565]: I1125 09:33:31.259208 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 25 09:33:31 crc kubenswrapper[4565]: I1125 09:33:31.367508 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/kube-state-metrics-0" Nov 25 09:33:31 crc kubenswrapper[4565]: I1125 09:33:31.367567 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 25 09:33:31 crc kubenswrapper[4565]: I1125 09:33:31.368146 4565 scope.go:117] "RemoveContainer" containerID="917974a4a670e3c73c3ea6efd2300d879b57787c0c729123a5d7a66c69917b4c" Nov 25 09:33:31 crc kubenswrapper[4565]: E1125 09:33:31.368421 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-state-metrics 
pod=kube-state-metrics-0_openstack(5275621d-5c51-4586-85f2-e0e24cb32266)\"" pod="openstack/kube-state-metrics-0" podUID="5275621d-5c51-4586-85f2-e0e24cb32266" Nov 25 09:33:31 crc kubenswrapper[4565]: I1125 09:33:31.471781 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 25 09:33:31 crc kubenswrapper[4565]: I1125 09:33:31.611571 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Nov 25 09:33:31 crc kubenswrapper[4565]: I1125 09:33:31.652802 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 25 09:33:31 crc kubenswrapper[4565]: I1125 09:33:31.687084 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 25 09:33:31 crc kubenswrapper[4565]: I1125 09:33:31.789477 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-sw4l6" Nov 25 09:33:31 crc kubenswrapper[4565]: I1125 09:33:31.824696 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-cttsd" Nov 25 09:33:31 crc kubenswrapper[4565]: I1125 09:33:31.858978 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Nov 25 09:33:31 crc kubenswrapper[4565]: I1125 09:33:31.884986 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 25 09:33:31 crc kubenswrapper[4565]: I1125 09:33:31.978161 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 25 09:33:32 crc kubenswrapper[4565]: I1125 09:33:32.003981 4565 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"metrics-tls" Nov 25 09:33:32 crc kubenswrapper[4565]: I1125 09:33:32.021516 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 25 09:33:32 crc kubenswrapper[4565]: I1125 09:33:32.098157 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" Nov 25 09:33:32 crc kubenswrapper[4565]: I1125 09:33:32.100626 4565 scope.go:117] "RemoveContainer" containerID="27dc1ea0afa359198b298e28f68b50312f5e0b24d681eb6e799596611d4da4dc" Nov 25 09:33:32 crc kubenswrapper[4565]: E1125 09:33:32.101049 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=openstack-operator-controller-manager-7cd5954d9-fkc7l_openstack-operators(579400cf-d71f-47f4-a98e-b94ccbf4ff72)\"" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" podUID="579400cf-d71f-47f4-a98e-b94ccbf4ff72" Nov 25 09:33:32 crc kubenswrapper[4565]: I1125 09:33:32.128321 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 25 09:33:32 crc kubenswrapper[4565]: I1125 09:33:32.209954 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 09:33:32 crc kubenswrapper[4565]: I1125 09:33:32.253546 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 25 09:33:32 crc kubenswrapper[4565]: I1125 09:33:32.316091 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 25 09:33:32 crc kubenswrapper[4565]: I1125 09:33:32.994852 4565 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"openstack-cell1-scripts" Nov 25 09:33:33 crc kubenswrapper[4565]: I1125 09:33:33.051594 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-z6jrc" Nov 25 09:33:33 crc kubenswrapper[4565]: I1125 09:33:33.119718 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 25 09:33:33 crc kubenswrapper[4565]: I1125 09:33:33.175202 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 25 09:33:33 crc kubenswrapper[4565]: I1125 09:33:33.292097 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Nov 25 09:33:33 crc kubenswrapper[4565]: I1125 09:33:33.466615 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 25 09:33:33 crc kubenswrapper[4565]: I1125 09:33:33.483771 4565 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 25 09:33:33 crc kubenswrapper[4565]: I1125 09:33:33.529115 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 25 09:33:33 crc kubenswrapper[4565]: I1125 09:33:33.592518 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 25 09:33:33 crc kubenswrapper[4565]: I1125 09:33:33.602016 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 25 09:33:33 crc kubenswrapper[4565]: I1125 09:33:33.604161 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 25 09:33:33 crc kubenswrapper[4565]: I1125 09:33:33.673448 4565 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 25 09:33:33 crc kubenswrapper[4565]: I1125 09:33:33.677985 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 25 09:33:33 crc kubenswrapper[4565]: I1125 09:33:33.696432 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 25 09:33:33 crc kubenswrapper[4565]: I1125 09:33:33.793030 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 25 09:33:33 crc kubenswrapper[4565]: I1125 09:33:33.841641 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 25 09:33:33 crc kubenswrapper[4565]: I1125 09:33:33.886788 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Nov 25 09:33:33 crc kubenswrapper[4565]: I1125 09:33:33.928523 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Nov 25 09:33:33 crc kubenswrapper[4565]: I1125 09:33:33.958978 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 25 09:33:33 crc kubenswrapper[4565]: I1125 09:33:33.961953 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 25 09:33:34 crc kubenswrapper[4565]: I1125 09:33:34.006345 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ntkbc" Nov 25 09:33:34 crc kubenswrapper[4565]: I1125 09:33:34.021161 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 25 09:33:34 crc kubenswrapper[4565]: I1125 09:33:34.088134 4565 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 25 09:33:34 crc kubenswrapper[4565]: I1125 09:33:34.104135 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 25 09:33:34 crc kubenswrapper[4565]: I1125 09:33:34.113278 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-tjxjs" Nov 25 09:33:34 crc kubenswrapper[4565]: I1125 09:33:34.171992 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 25 09:33:34 crc kubenswrapper[4565]: I1125 09:33:34.195582 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 25 09:33:34 crc kubenswrapper[4565]: I1125 09:33:34.239752 4565 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 25 09:33:34 crc kubenswrapper[4565]: I1125 09:33:34.322912 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-9825s" Nov 25 09:33:34 crc kubenswrapper[4565]: I1125 09:33:34.347453 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Nov 25 09:33:34 crc kubenswrapper[4565]: I1125 09:33:34.505452 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 25 09:33:34 crc kubenswrapper[4565]: I1125 09:33:34.516885 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 25 09:33:34 crc kubenswrapper[4565]: I1125 09:33:34.516980 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 25 09:33:34 crc kubenswrapper[4565]: I1125 
09:33:34.520127 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 25 09:33:34 crc kubenswrapper[4565]: I1125 09:33:34.621496 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 25 09:33:34 crc kubenswrapper[4565]: I1125 09:33:34.627910 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 25 09:33:34 crc kubenswrapper[4565]: I1125 09:33:34.639163 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 25 09:33:34 crc kubenswrapper[4565]: I1125 09:33:34.639975 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 25 09:33:34 crc kubenswrapper[4565]: I1125 09:33:34.769321 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.024879 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-zqs2w" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.070627 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.079585 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.092597 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-fmk8b" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.120720 4565 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.134391 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.170890 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.180626 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.199548 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.216746 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-t466t" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.218232 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.224674 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.250881 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.264654 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.264836 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.428126 4565 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.451867 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.496876 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.531152 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.553756 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.553808 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.569685 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.596168 4565 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-6rgj5" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.626513 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.626765 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.633160 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.634304 4565 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.641534 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.642011 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.649125 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.663842 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-v7wmt" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.692578 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-z8wjg" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.723305 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.782747 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.786746 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.800542 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.862313 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.864447 4565 reflector.go:368] Caches populated for 
*v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 25 09:33:35 crc kubenswrapper[4565]: I1125 09:33:35.934996 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.035088 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.045478 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.098524 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.109809 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.134488 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.156902 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.201144 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.217720 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.223799 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.242003 4565 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.285784 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.363539 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.365124 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-s6c8r" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.384852 4565 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-jsfd9" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.447991 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.494344 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.504003 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.509246 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-69k9h" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.559791 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.561039 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 25 
09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.584007 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.584709 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.586809 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.618725 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.642015 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-hrdhn" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.662177 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.662904 4565 scope.go:117] "RemoveContainer" containerID="a2d7d47dc55b713b12285c0b4f59ed400fdea491fccbf6452fa96db4f49cfb32" Nov 25 09:33:36 crc kubenswrapper[4565]: E1125 09:33:36.663194 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=metallb-operator-controller-manager-74454849f9-fjwfp_metallb-system(145e5d59-fd78-4bc1-a97c-17ebf0d67fa4)\"" pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" podUID="145e5d59-fd78-4bc1-a97c-17ebf0d67fa4" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.686146 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 
09:33:36.692223 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.703362 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.731641 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.736681 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.753876 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.823202 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.831544 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.853940 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.888281 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.935892 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.968533 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 25 09:33:36 crc kubenswrapper[4565]: I1125 09:33:36.996234 4565 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.043611 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.056685 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.091571 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.103651 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.123657 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.123678 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-j2q7d" Nov 25 09:33:37 crc kubenswrapper[4565]: E1125 09:33:37.124147 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.137392 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.143407 4565 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.171617 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.203760 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.246142 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.259579 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.357690 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-mlncl" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.363470 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.363774 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.367806 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-8t6bx" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.384758 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-fjg7f" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.415775 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.422863 4565 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.430099 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.430604 4565 scope.go:117] "RemoveContainer" containerID="c6877b4d4be1ca791052e41504ccb21d544739bdcad6fd6483be3682df29e3f4" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.479179 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.480250 4565 scope.go:117] "RemoveContainer" containerID="525ec3b9b678e82ea9c648ffcb9ff3582f8f5209a588bcf379f896c66f66fe14" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.490598 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.491018 4565 scope.go:117] "RemoveContainer" containerID="b72b884caf1f74bd9d07b5f310d0199b25597337fb741a6dba0515a2d0f4ce2a" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.501294 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-xrp48" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.505423 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.506312 4565 scope.go:117] "RemoveContainer" containerID="aafb0d9aabc1c967ac2bae0d9486472dbe063252bab7ac1a2974b7a1c2fe55ae" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.517634 4565 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"image-registry-certificates" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.527472 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.527853 4565 scope.go:117] "RemoveContainer" containerID="ae884bbc6cb6f17604592c43ea4a5a4e3e0cc2967174a7f3bf1aece6bbdcd1b9" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.545139 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.545479 4565 scope.go:117] "RemoveContainer" containerID="6f866ba4259bb4c97427e615758be3fe9be7d64fc33e85de34f9931a2e9f42e0" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.548906 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.552347 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.576797 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wgk4c" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.637644 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.641393 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.642296 4565 scope.go:117] "RemoveContainer" containerID="df5c25075caf8dd5cd697e3a0f68e7c41b80f47ff63499be753dc30d43daa6b3" Nov 25 09:33:37 crc kubenswrapper[4565]: 
I1125 09:33:37.668583 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-d9k2v" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.674525 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.815762 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.834032 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.841620 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.842349 4565 scope.go:117] "RemoveContainer" containerID="58ec3e507272ad66937820573df826cb681a2838d59b3fcf35ea899e99ac1d26" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.852522 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.874023 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.876551 4565 scope.go:117] "RemoveContainer" containerID="9ad324ad7207631b4c3ace3dfa8012001ad2d404f7545a3ef392b5cb2c4f0dc9" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.894915 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.899101 4565 scope.go:117] 
"RemoveContainer" containerID="53c43e5c9a264f3924b9a24e8612c83cf005f784b2024e5ac0fb03b06976dffe" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.977772 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.978304 4565 scope.go:117] "RemoveContainer" containerID="8eb6276b64d3d07c26e1e60c5d3c7c9a38cc98ea142fa0ef2f183d791476a510" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.996577 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" Nov 25 09:33:37 crc kubenswrapper[4565]: I1125 09:33:37.997370 4565 scope.go:117] "RemoveContainer" containerID="6cdcbd3a99dad41a9490820cbd4caba91b874c3084db74620830cfb37f590db5" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.011180 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.018358 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.032551 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.096352 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.097165 4565 scope.go:117] "RemoveContainer" containerID="3b7f1e6786d926dfd011db64eaf83fb1f02bcfd5a1572e1f566eb1f2f2b2dc51" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.123231 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 
09:33:38.171480 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.172345 4565 scope.go:117] "RemoveContainer" containerID="fece14f8dd7ae8608f19ace2a9ae19b0a2f7ddf82794bc19a15fe2f63af6931b" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.208270 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.208686 4565 scope.go:117] "RemoveContainer" containerID="6f09f6f09b8e3f7229a752e45171713583202cfe7a6b50597a068b0deee5b8cb" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.219027 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.220174 4565 scope.go:117] "RemoveContainer" containerID="93384b9e296e68ef1b6e49f054b8a77404bc2ef513cceba546e0c57ea53bde50" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.230487 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.244879 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-n9qnr" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.262716 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.264055 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.265376 4565 
scope.go:117] "RemoveContainer" containerID="c7fdc354d4332496f54d6bac287a77e5366ec2b3bdffb36d0201e129de32241c" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.274893 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.309026 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.310659 4565 scope.go:117] "RemoveContainer" containerID="be1e26f0c8276ef465c6ecc95e1f2a46dc830a4eb6cf271efa889dba945f1a1c" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.317174 4565 generic.go:334] "Generic (PLEG): container finished" podID="d0ef0237-045a-4153-a377-07b2c9e6ceba" containerID="aa3c698b68b2bd17f9b55cc1a21ffe612f175d7f6a374f7bc30e98d6378fffdb" exitCode=1 Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.317248 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" event={"ID":"d0ef0237-045a-4153-a377-07b2c9e6ceba","Type":"ContainerDied","Data":"aa3c698b68b2bd17f9b55cc1a21ffe612f175d7f6a374f7bc30e98d6378fffdb"} Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.317283 4565 scope.go:117] "RemoveContainer" containerID="8eb6276b64d3d07c26e1e60c5d3c7c9a38cc98ea142fa0ef2f183d791476a510" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.317693 4565 scope.go:117] "RemoveContainer" containerID="aa3c698b68b2bd17f9b55cc1a21ffe612f175d7f6a374f7bc30e98d6378fffdb" Nov 25 09:33:38 crc kubenswrapper[4565]: E1125 09:33:38.317997 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager 
pod=neutron-operator-controller-manager-7c57c8bbc4-pzd74_openstack-operators(d0ef0237-045a-4153-a377-07b2c9e6ceba)\"" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" podUID="d0ef0237-045a-4153-a377-07b2c9e6ceba" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.323046 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.323547 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.325362 4565 generic.go:334] "Generic (PLEG): container finished" podID="873884b1-6ee8-400c-9ca2-0b0b3c4618e9" containerID="3aad156cb91394097a90d86471f024bc6360f0107472880b3fc9d7c39d20c713" exitCode=1 Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.325397 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" event={"ID":"873884b1-6ee8-400c-9ca2-0b0b3c4618e9","Type":"ContainerDied","Data":"3aad156cb91394097a90d86471f024bc6360f0107472880b3fc9d7c39d20c713"} Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.325776 4565 scope.go:117] "RemoveContainer" containerID="3aad156cb91394097a90d86471f024bc6360f0107472880b3fc9d7c39d20c713" Nov 25 09:33:38 crc kubenswrapper[4565]: E1125 09:33:38.325982 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=barbican-operator-controller-manager-86dc4d89c8-cxwrc_openstack-operators(873884b1-6ee8-400c-9ca2-0b0b3c4618e9)\"" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" podUID="873884b1-6ee8-400c-9ca2-0b0b3c4618e9" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.331266 4565 generic.go:334] "Generic (PLEG): container finished" 
podID="cf68120a-e894-4189-8035-91f8045618c0" containerID="0bd57640bc36417923138c806a0ddf67d2cce1850e4e8017df66f645df8325a0" exitCode=1 Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.331340 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" event={"ID":"cf68120a-e894-4189-8035-91f8045618c0","Type":"ContainerDied","Data":"0bd57640bc36417923138c806a0ddf67d2cce1850e4e8017df66f645df8325a0"} Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.332265 4565 scope.go:117] "RemoveContainer" containerID="0bd57640bc36417923138c806a0ddf67d2cce1850e4e8017df66f645df8325a0" Nov 25 09:33:38 crc kubenswrapper[4565]: E1125 09:33:38.332941 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=manila-operator-controller-manager-58bb8d67cc-lz6zt_openstack-operators(cf68120a-e894-4189-8035-91f8045618c0)\"" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" podUID="cf68120a-e894-4189-8035-91f8045618c0" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.341849 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.342300 4565 generic.go:334] "Generic (PLEG): container finished" podID="a933a688-5393-4b7b-b0b7-6ee5791970b1" containerID="0720d56d8209612e22c0b8e051d898758d5c194ec89bb5c1df3433b39981bc9b" exitCode=1 Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.342350 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" event={"ID":"a933a688-5393-4b7b-b0b7-6ee5791970b1","Type":"ContainerDied","Data":"0720d56d8209612e22c0b8e051d898758d5c194ec89bb5c1df3433b39981bc9b"} Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.342800 4565 scope.go:117] 
"RemoveContainer" containerID="0720d56d8209612e22c0b8e051d898758d5c194ec89bb5c1df3433b39981bc9b" Nov 25 09:33:38 crc kubenswrapper[4565]: E1125 09:33:38.343006 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=designate-operator-controller-manager-7d695c9b56-t68ww_openstack-operators(a933a688-5393-4b7b-b0b7-6ee5791970b1)\"" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" podUID="a933a688-5393-4b7b-b0b7-6ee5791970b1" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.350837 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.355487 4565 generic.go:334] "Generic (PLEG): container finished" podID="d5be161b-0f0c-485e-b1c7-50a9fff4b053" containerID="bb38ed534609c8ae37fa41f78e13717faad602d45197ced0da18ec5542675416" exitCode=1 Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.355554 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" event={"ID":"d5be161b-0f0c-485e-b1c7-50a9fff4b053","Type":"ContainerDied","Data":"bb38ed534609c8ae37fa41f78e13717faad602d45197ced0da18ec5542675416"} Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.356343 4565 scope.go:117] "RemoveContainer" containerID="bb38ed534609c8ae37fa41f78e13717faad602d45197ced0da18ec5542675416" Nov 25 09:33:38 crc kubenswrapper[4565]: E1125 09:33:38.356843 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=keystone-operator-controller-manager-748dc6576f-pcqxq_openstack-operators(d5be161b-0f0c-485e-b1c7-50a9fff4b053)\"" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" 
podUID="d5be161b-0f0c-485e-b1c7-50a9fff4b053" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.362832 4565 generic.go:334] "Generic (PLEG): container finished" podID="92be75e0-b60b-4f41-bde1-4f74a4d306e3" containerID="edc460c87e3dda646dd97574694f175e86617802fe573539d27d9d21dae913fa" exitCode=1 Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.362896 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" event={"ID":"92be75e0-b60b-4f41-bde1-4f74a4d306e3","Type":"ContainerDied","Data":"edc460c87e3dda646dd97574694f175e86617802fe573539d27d9d21dae913fa"} Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.363058 4565 scope.go:117] "RemoveContainer" containerID="c6877b4d4be1ca791052e41504ccb21d544739bdcad6fd6483be3682df29e3f4" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.363528 4565 scope.go:117] "RemoveContainer" containerID="edc460c87e3dda646dd97574694f175e86617802fe573539d27d9d21dae913fa" Nov 25 09:33:38 crc kubenswrapper[4565]: E1125 09:33:38.363826 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=glance-operator-controller-manager-68b95954c9-f9bbj_openstack-operators(92be75e0-b60b-4f41-bde1-4f74a4d306e3)\"" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" podUID="92be75e0-b60b-4f41-bde1-4f74a4d306e3" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.367119 4565 generic.go:334] "Generic (PLEG): container finished" podID="1af57713-55c3-45ec-b98b-1aac75a2d60b" containerID="60d8f30480bbc07d9fb2cde42cfe9eaa3c9076fbb3de41f5b2c96706c3bbc4b0" exitCode=1 Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.367173 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" 
event={"ID":"1af57713-55c3-45ec-b98b-1aac75a2d60b","Type":"ContainerDied","Data":"60d8f30480bbc07d9fb2cde42cfe9eaa3c9076fbb3de41f5b2c96706c3bbc4b0"} Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.367941 4565 scope.go:117] "RemoveContainer" containerID="60d8f30480bbc07d9fb2cde42cfe9eaa3c9076fbb3de41f5b2c96706c3bbc4b0" Nov 25 09:33:38 crc kubenswrapper[4565]: E1125 09:33:38.368172 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=cinder-operator-controller-manager-79856dc55c-ddlth_openstack-operators(1af57713-55c3-45ec-b98b-1aac75a2d60b)\"" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" podUID="1af57713-55c3-45ec-b98b-1aac75a2d60b" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.371171 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.371751 4565 scope.go:117] "RemoveContainer" containerID="018c92578f58e2078e4d374e10435340fcbf886cd048eea55abfd31cf9ca2e6b" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.382808 4565 generic.go:334] "Generic (PLEG): container finished" podID="f2c67417-c283-4158-91ec-f49478a5378e" containerID="8ff33403676c987edafe00cc1b1214a3de810b90129aa373b2a8bfb758a9d569" exitCode=1 Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.382863 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" event={"ID":"f2c67417-c283-4158-91ec-f49478a5378e","Type":"ContainerDied","Data":"8ff33403676c987edafe00cc1b1214a3de810b90129aa373b2a8bfb758a9d569"} Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.383754 4565 scope.go:117] "RemoveContainer" containerID="8ff33403676c987edafe00cc1b1214a3de810b90129aa373b2a8bfb758a9d569" Nov 25 
09:33:38 crc kubenswrapper[4565]: E1125 09:33:38.384045 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=nova-operator-controller-manager-79556f57fc-n9bdd_openstack-operators(f2c67417-c283-4158-91ec-f49478a5378e)\"" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" podUID="f2c67417-c283-4158-91ec-f49478a5378e" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.386701 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.388370 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.389194 4565 scope.go:117] "RemoveContainer" containerID="3808caf582fae45d0177a3cc8c982ff3deba632f9ec8ec60e01eb6832b54630a" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.390980 4565 generic.go:334] "Generic (PLEG): container finished" podID="354fe5db-35d0-4d94-989c-02a077f8bd20" containerID="39ce5a59d7c7c5041aa9cbe05a8f612a97ad284fa3bb48e4557c04e282d6796d" exitCode=1 Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.391040 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" event={"ID":"354fe5db-35d0-4d94-989c-02a077f8bd20","Type":"ContainerDied","Data":"39ce5a59d7c7c5041aa9cbe05a8f612a97ad284fa3bb48e4557c04e282d6796d"} Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.391541 4565 scope.go:117] "RemoveContainer" containerID="39ce5a59d7c7c5041aa9cbe05a8f612a97ad284fa3bb48e4557c04e282d6796d" Nov 25 09:33:38 crc kubenswrapper[4565]: E1125 09:33:38.391781 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=horizon-operator-controller-manager-68c9694994-2s9lf_openstack-operators(354fe5db-35d0-4d94-989c-02a077f8bd20)\"" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" podUID="354fe5db-35d0-4d94-989c-02a077f8bd20" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.401802 4565 generic.go:334] "Generic (PLEG): container finished" podID="4ee66804-213d-4e52-b04b-6b00eec8de2d" containerID="f39dcedc4e006fd2db9645f254c4b3ea358db41b2a05716117ef4f9e0271f316" exitCode=1 Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.401881 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" event={"ID":"4ee66804-213d-4e52-b04b-6b00eec8de2d","Type":"ContainerDied","Data":"f39dcedc4e006fd2db9645f254c4b3ea358db41b2a05716117ef4f9e0271f316"} Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.402595 4565 scope.go:117] "RemoveContainer" containerID="f39dcedc4e006fd2db9645f254c4b3ea358db41b2a05716117ef4f9e0271f316" Nov 25 09:33:38 crc kubenswrapper[4565]: E1125 09:33:38.403061 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=mariadb-operator-controller-manager-cb6c4fdb7-2gkww_openstack-operators(4ee66804-213d-4e52-b04b-6b00eec8de2d)\"" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" podUID="4ee66804-213d-4e52-b04b-6b00eec8de2d" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.419066 4565 generic.go:334] "Generic (PLEG): container finished" podID="6402fac4-067f-4410-a00c-0d438d502f3c" containerID="b1c176493b938f65a21d4e1e1a579662243e2bafa7e6c3ac777aa37c48e17c7c" exitCode=1 Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.419122 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" event={"ID":"6402fac4-067f-4410-a00c-0d438d502f3c","Type":"ContainerDied","Data":"b1c176493b938f65a21d4e1e1a579662243e2bafa7e6c3ac777aa37c48e17c7c"} Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.419758 4565 scope.go:117] "RemoveContainer" containerID="b1c176493b938f65a21d4e1e1a579662243e2bafa7e6c3ac777aa37c48e17c7c" Nov 25 09:33:38 crc kubenswrapper[4565]: E1125 09:33:38.420104 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=ironic-operator-controller-manager-5bfcdc958c-mjsqx_openstack-operators(6402fac4-067f-4410-a00c-0d438d502f3c)\"" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" podUID="6402fac4-067f-4410-a00c-0d438d502f3c" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.444665 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.454186 4565 scope.go:117] "RemoveContainer" containerID="58ec3e507272ad66937820573df826cb681a2838d59b3fcf35ea899e99ac1d26" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.454429 4565 generic.go:334] "Generic (PLEG): container finished" podID="93da1f7e-c5e8-4c9c-b6af-feb85c526b47" containerID="333feec57ee5037f47acd2d672641c026f1178aa1db59f23cd4d0a0a332c736a" exitCode=1 Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.454564 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" event={"ID":"93da1f7e-c5e8-4c9c-b6af-feb85c526b47","Type":"ContainerDied","Data":"333feec57ee5037f47acd2d672641c026f1178aa1db59f23cd4d0a0a332c736a"} Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.455504 4565 scope.go:117] "RemoveContainer" 
containerID="333feec57ee5037f47acd2d672641c026f1178aa1db59f23cd4d0a0a332c736a" Nov 25 09:33:38 crc kubenswrapper[4565]: E1125 09:33:38.457111 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=heat-operator-controller-manager-774b86978c-bd8d6_openstack-operators(93da1f7e-c5e8-4c9c-b6af-feb85c526b47)\"" pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" podUID="93da1f7e-c5e8-4c9c-b6af-feb85c526b47" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.481779 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.487106 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.494904 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.504512 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.529043 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.556443 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.560778 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.564106 4565 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"ovsdbserver-sb" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.566818 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.684994 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.746400 4565 scope.go:117] "RemoveContainer" containerID="ae884bbc6cb6f17604592c43ea4a5a4e3e0cc2967174a7f3bf1aece6bbdcd1b9" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.765717 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.780724 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.809536 4565 scope.go:117] "RemoveContainer" containerID="9ad324ad7207631b4c3ace3dfa8012001ad2d404f7545a3ef392b5cb2c4f0dc9" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.823844 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.867565 4565 scope.go:117] "RemoveContainer" containerID="aafb0d9aabc1c967ac2bae0d9486472dbe063252bab7ac1a2974b7a1c2fe55ae" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.872449 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.892355 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-95smf" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.893809 4565 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.901033 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.911278 4565 scope.go:117] "RemoveContainer" containerID="525ec3b9b678e82ea9c648ffcb9ff3582f8f5209a588bcf379f896c66f66fe14" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.931423 4565 scope.go:117] "RemoveContainer" containerID="6cdcbd3a99dad41a9490820cbd4caba91b874c3084db74620830cfb37f590db5" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.937127 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-hm54w" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.963237 4565 scope.go:117] "RemoveContainer" containerID="6f866ba4259bb4c97427e615758be3fe9be7d64fc33e85de34f9931a2e9f42e0" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.987508 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 25 09:33:38 crc kubenswrapper[4565]: I1125 09:33:38.995962 4565 scope.go:117] "RemoveContainer" containerID="53c43e5c9a264f3924b9a24e8612c83cf005f784b2024e5ac0fb03b06976dffe" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.013182 4565 scope.go:117] "RemoveContainer" containerID="df5c25075caf8dd5cd697e3a0f68e7c41b80f47ff63499be753dc30d43daa6b3" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.017483 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.017658 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.044573 4565 scope.go:117] "RemoveContainer" 
containerID="b72b884caf1f74bd9d07b5f310d0199b25597337fb741a6dba0515a2d0f4ce2a" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.056352 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.092152 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.115843 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.200505 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.204904 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.238727 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.322036 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.355795 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.406891 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.410586 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.432034 4565 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.461908 4565 generic.go:334] "Generic (PLEG): container finished" podID="6279e5b8-cc23-4b43-9554-754a61174bcd" containerID="e020be677c47fe23cd5f8ad7616cb3050bc6e134ad73783fef6fa49aa2df1f5a" exitCode=1 Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.461981 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" event={"ID":"6279e5b8-cc23-4b43-9554-754a61174bcd","Type":"ContainerDied","Data":"e020be677c47fe23cd5f8ad7616cb3050bc6e134ad73783fef6fa49aa2df1f5a"} Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.462014 4565 scope.go:117] "RemoveContainer" containerID="fece14f8dd7ae8608f19ace2a9ae19b0a2f7ddf82794bc19a15fe2f63af6931b" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.462544 4565 scope.go:117] "RemoveContainer" containerID="e020be677c47fe23cd5f8ad7616cb3050bc6e134ad73783fef6fa49aa2df1f5a" Nov 25 09:33:39 crc kubenswrapper[4565]: E1125 09:33:39.462762 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=octavia-operator-controller-manager-fd75fd47d-hrr6t_openstack-operators(6279e5b8-cc23-4b43-9554-754a61174bcd)\"" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" podUID="6279e5b8-cc23-4b43-9554-754a61174bcd" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.465425 4565 generic.go:334] "Generic (PLEG): container finished" podID="333ae034-2972-4915-a547-364c01510827" containerID="83e42ee54a1224d7b2656ad17410377bc7144f68868f1e208f218d89791aa8d2" exitCode=1 Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.465475 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" 
event={"ID":"333ae034-2972-4915-a547-364c01510827","Type":"ContainerDied","Data":"83e42ee54a1224d7b2656ad17410377bc7144f68868f1e208f218d89791aa8d2"} Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.465796 4565 scope.go:117] "RemoveContainer" containerID="83e42ee54a1224d7b2656ad17410377bc7144f68868f1e208f218d89791aa8d2" Nov 25 09:33:39 crc kubenswrapper[4565]: E1125 09:33:39.466033 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=infra-operator-controller-manager-d5cc86f4b-2q9rf_openstack-operators(333ae034-2972-4915-a547-364c01510827)\"" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" podUID="333ae034-2972-4915-a547-364c01510827" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.489702 4565 generic.go:334] "Generic (PLEG): container finished" podID="f35f4446-328e-40d3-96d6-2bc814fb8a96" containerID="6212e62c68bacd6c61b1d6bd2a7998b9eae5e36bb751573695de237805328fbb" exitCode=1 Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.489756 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" event={"ID":"f35f4446-328e-40d3-96d6-2bc814fb8a96","Type":"ContainerDied","Data":"6212e62c68bacd6c61b1d6bd2a7998b9eae5e36bb751573695de237805328fbb"} Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.490157 4565 scope.go:117] "RemoveContainer" containerID="6212e62c68bacd6c61b1d6bd2a7998b9eae5e36bb751573695de237805328fbb" Nov 25 09:33:39 crc kubenswrapper[4565]: E1125 09:33:39.490346 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=swift-operator-controller-manager-6fdc4fcf86-zl2jr_openstack-operators(f35f4446-328e-40d3-96d6-2bc814fb8a96)\"" 
pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" podUID="f35f4446-328e-40d3-96d6-2bc814fb8a96" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.498991 4565 generic.go:334] "Generic (PLEG): container finished" podID="1ef630cb-2220-41f5-8a3d-66a2a78ce0ce" containerID="e16b7b97243bbe5b43747b8ade9ddfb8233607897b27a4a30e3f397b34533859" exitCode=1 Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.499023 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" event={"ID":"1ef630cb-2220-41f5-8a3d-66a2a78ce0ce","Type":"ContainerDied","Data":"e16b7b97243bbe5b43747b8ade9ddfb8233607897b27a4a30e3f397b34533859"} Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.499329 4565 scope.go:117] "RemoveContainer" containerID="e16b7b97243bbe5b43747b8ade9ddfb8233607897b27a4a30e3f397b34533859" Nov 25 09:33:39 crc kubenswrapper[4565]: E1125 09:33:39.499536 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=telemetry-operator-controller-manager-567f98c9d-7dzx4_openstack-operators(1ef630cb-2220-41f5-8a3d-66a2a78ce0ce)\"" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" podUID="1ef630cb-2220-41f5-8a3d-66a2a78ce0ce" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.505204 4565 generic.go:334] "Generic (PLEG): container finished" podID="a65931e1-7a1f-4251-9c4f-996b407dfb03" containerID="7b2b75825ec45ab3cc5d2eb938e19c7ef8a8f4f5f69d192abc169acd1f2738ab" exitCode=1 Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.505229 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4llp" event={"ID":"a65931e1-7a1f-4251-9c4f-996b407dfb03","Type":"ContainerDied","Data":"7b2b75825ec45ab3cc5d2eb938e19c7ef8a8f4f5f69d192abc169acd1f2738ab"} Nov 25 
09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.505761 4565 scope.go:117] "RemoveContainer" containerID="7b2b75825ec45ab3cc5d2eb938e19c7ef8a8f4f5f69d192abc169acd1f2738ab" Nov 25 09:33:39 crc kubenswrapper[4565]: E1125 09:33:39.505964 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=operator pod=rabbitmq-cluster-operator-manager-668c99d594-s4llp_openstack-operators(a65931e1-7a1f-4251-9c4f-996b407dfb03)\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4llp" podUID="a65931e1-7a1f-4251-9c4f-996b407dfb03" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.510907 4565 generic.go:334] "Generic (PLEG): container finished" podID="3791b99a-d877-470f-8a8f-56f7b02be997" containerID="43d2cb4251d1bf6c3249120a7c62ab420056324d129cceca35c85487f5c8e6d6" exitCode=1 Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.510986 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" event={"ID":"3791b99a-d877-470f-8a8f-56f7b02be997","Type":"ContainerDied","Data":"43d2cb4251d1bf6c3249120a7c62ab420056324d129cceca35c85487f5c8e6d6"} Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.511755 4565 scope.go:117] "RemoveContainer" containerID="43d2cb4251d1bf6c3249120a7c62ab420056324d129cceca35c85487f5c8e6d6" Nov 25 09:33:39 crc kubenswrapper[4565]: E1125 09:33:39.512002 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=watcher-operator-controller-manager-864885998-v2c96_openstack-operators(3791b99a-d877-470f-8a8f-56f7b02be997)\"" pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" podUID="3791b99a-d877-470f-8a8f-56f7b02be997" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.513491 4565 
generic.go:334] "Generic (PLEG): container finished" podID="052c7786-4d54-4af0-8598-91ff09cdf966" containerID="9564dbfc50d702602f079771897f60dfc5280e47ad4f85398621b879eb1202d5" exitCode=1 Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.513560 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" event={"ID":"052c7786-4d54-4af0-8598-91ff09cdf966","Type":"ContainerDied","Data":"9564dbfc50d702602f079771897f60dfc5280e47ad4f85398621b879eb1202d5"} Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.513875 4565 scope.go:117] "RemoveContainer" containerID="9564dbfc50d702602f079771897f60dfc5280e47ad4f85398621b879eb1202d5" Nov 25 09:33:39 crc kubenswrapper[4565]: E1125 09:33:39.514081 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=ovn-operator-controller-manager-66cf5c67ff-zz6wf_openstack-operators(052c7786-4d54-4af0-8598-91ff09cdf966)\"" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" podUID="052c7786-4d54-4af0-8598-91ff09cdf966" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.527642 4565 generic.go:334] "Generic (PLEG): container finished" podID="31dbf471-6fab-4ddd-a384-e4dd5335d5dc" containerID="2039e083d41f8bbf43c768660ed310c591665d3d19066508b5e04327f665da82" exitCode=1 Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.527688 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" event={"ID":"31dbf471-6fab-4ddd-a384-e4dd5335d5dc","Type":"ContainerDied","Data":"2039e083d41f8bbf43c768660ed310c591665d3d19066508b5e04327f665da82"} Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.528565 4565 scope.go:117] "RemoveContainer" containerID="2039e083d41f8bbf43c768660ed310c591665d3d19066508b5e04327f665da82" Nov 25 09:33:39 crc 
kubenswrapper[4565]: E1125 09:33:39.528839 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=placement-operator-controller-manager-5db546f9d9-kgn59_openstack-operators(31dbf471-6fab-4ddd-a384-e4dd5335d5dc)\"" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" podUID="31dbf471-6fab-4ddd-a384-e4dd5335d5dc" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.541974 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.542442 4565 scope.go:117] "RemoveContainer" containerID="6f09f6f09b8e3f7229a752e45171713583202cfe7a6b50597a068b0deee5b8cb" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.550403 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.583553 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.594656 4565 scope.go:117] "RemoveContainer" containerID="be1e26f0c8276ef465c6ecc95e1f2a46dc830a4eb6cf271efa889dba945f1a1c" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.622065 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.638174 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.640152 4565 scope.go:117] "RemoveContainer" containerID="018c92578f58e2078e4d374e10435340fcbf886cd048eea55abfd31cf9ca2e6b" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.710876 4565 scope.go:117] "RemoveContainer" 
containerID="3b7f1e6786d926dfd011db64eaf83fb1f02bcfd5a1572e1f566eb1f2f2b2dc51" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.733821 4565 scope.go:117] "RemoveContainer" containerID="3808caf582fae45d0177a3cc8c982ff3deba632f9ec8ec60e01eb6832b54630a" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.739344 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.750822 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.758872 4565 scope.go:117] "RemoveContainer" containerID="93384b9e296e68ef1b6e49f054b8a77404bc2ef513cceba546e0c57ea53bde50" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.762133 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.769071 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-f7lv9" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.779299 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.782757 4565 scope.go:117] "RemoveContainer" containerID="c7fdc354d4332496f54d6bac287a77e5366ec2b3bdffb36d0201e129de32241c" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.788672 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.808217 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.837531 4565 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.885889 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.890748 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.902992 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.922407 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.953360 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.972038 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 25 09:33:39 crc kubenswrapper[4565]: I1125 09:33:39.995365 4565 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-thjvr" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.029828 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.061146 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.075613 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 
09:33:40.133379 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.158368 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.170704 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.220979 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.257820 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.273901 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.285710 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.314232 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.340284 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.343960 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.393198 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 25 09:33:40 crc 
kubenswrapper[4565]: I1125 09:33:40.450450 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.476953 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.494414 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-gccb8" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.502797 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.526632 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.536260 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.541896 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.630584 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.640086 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.679373 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.840088 4565 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.945689 4565 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.961534 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 25 09:33:40 crc kubenswrapper[4565]: I1125 09:33:40.975147 4565 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.008493 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.034355 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.094406 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.160721 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.206095 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.227178 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-4dlcq" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.306175 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 25 09:33:41 crc 
kubenswrapper[4565]: I1125 09:33:41.307575 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.339175 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.352743 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.444383 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.453754 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.468627 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.484888 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-kkgwb" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.494160 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.497130 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.513238 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-7kkqt" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.527004 4565 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cert-keystone-public-svc" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.530032 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.541243 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.543949 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-jxwmh" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.553848 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.609224 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.612908 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.624914 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.714402 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-4kxdq" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.760629 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.856494 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.876702 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-5wxvg" Nov 25 09:33:41 crc 
kubenswrapper[4565]: I1125 09:33:41.879675 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.882690 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.889034 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.907548 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-6bz9q" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.911775 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.934661 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.956171 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 25 09:33:41 crc kubenswrapper[4565]: I1125 09:33:41.988002 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.007575 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.019921 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-7kwx8" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.044627 4565 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.044628 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.047836 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.070172 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.083424 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.098159 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.098532 4565 scope.go:117] "RemoveContainer" containerID="27dc1ea0afa359198b298e28f68b50312f5e0b24d681eb6e799596611d4da4dc" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.100477 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xg26s" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.151674 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.163565 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.247956 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-qrbtx" Nov 25 09:33:42 crc 
kubenswrapper[4565]: I1125 09:33:42.330564 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.341294 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.368322 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.376687 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.405372 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.480187 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.514524 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-kgbtq" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.542898 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-d5gvb" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.567317 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.575300 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.587794 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" event={"ID":"579400cf-d71f-47f4-a98e-b94ccbf4ff72","Type":"ContainerStarted","Data":"87a32bcd825c194f85a4818ea5de3baf86646e7b6da1d50147adb1172c46d7b9"} Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.587992 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.602127 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.624286 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.674589 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.675320 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.786602 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.790335 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.862714 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.900395 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.901717 4565 reflector.go:368] Caches populated for *v1.Secret 
from object-"openstack"/"neutron-httpd-config" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.903702 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.908749 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.933342 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.973713 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-xmrf5" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.975584 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 25 09:33:42 crc kubenswrapper[4565]: I1125 09:33:42.987794 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Nov 25 09:33:43 crc kubenswrapper[4565]: I1125 09:33:43.046050 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Nov 25 09:33:43 crc kubenswrapper[4565]: I1125 09:33:43.080994 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-q77j7" Nov 25 09:33:43 crc kubenswrapper[4565]: I1125 09:33:43.088603 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 25 09:33:43 crc kubenswrapper[4565]: I1125 09:33:43.089657 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Nov 25 09:33:43 crc kubenswrapper[4565]: I1125 09:33:43.105684 4565 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 25 09:33:43 crc kubenswrapper[4565]: I1125 09:33:43.131253 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 25 09:33:43 crc kubenswrapper[4565]: I1125 09:33:43.132139 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 25 09:33:43 crc kubenswrapper[4565]: I1125 09:33:43.134659 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 25 09:33:43 crc kubenswrapper[4565]: I1125 09:33:43.165807 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 25 09:33:43 crc kubenswrapper[4565]: I1125 09:33:43.257226 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 25 09:33:43 crc kubenswrapper[4565]: I1125 09:33:43.289326 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Nov 25 09:33:43 crc kubenswrapper[4565]: I1125 09:33:43.404453 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 25 09:33:43 crc kubenswrapper[4565]: I1125 09:33:43.423140 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 25 09:33:43 crc kubenswrapper[4565]: I1125 09:33:43.467877 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-xm9gt" Nov 25 09:33:43 crc kubenswrapper[4565]: I1125 09:33:43.515683 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-kpqnd" Nov 25 09:33:43 crc kubenswrapper[4565]: I1125 09:33:43.527958 4565 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-sd5bm" Nov 25 09:33:43 crc kubenswrapper[4565]: I1125 09:33:43.549858 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Nov 25 09:33:43 crc kubenswrapper[4565]: I1125 09:33:43.710560 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 25 09:33:43 crc kubenswrapper[4565]: I1125 09:33:43.732638 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Nov 25 09:33:43 crc kubenswrapper[4565]: I1125 09:33:43.796391 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 25 09:33:43 crc kubenswrapper[4565]: I1125 09:33:43.826270 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 25 09:33:43 crc kubenswrapper[4565]: I1125 09:33:43.902868 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-g9jgd" Nov 25 09:33:43 crc kubenswrapper[4565]: I1125 09:33:43.997405 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.006664 4565 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.044074 4565 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.055191 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=45.055170554 podStartE2EDuration="45.055170554s" 
podCreationTimestamp="2025-11-25 09:32:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:33:20.609316647 +0000 UTC m=+1733.811811785" watchObservedRunningTime="2025-11-25 09:33:44.055170554 +0000 UTC m=+1757.257665693" Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.060399 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.064141 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.064204 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.076250 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.076335 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.080512 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.096522 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=24.096505319 podStartE2EDuration="24.096505319s" podCreationTimestamp="2025-11-25 09:33:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:33:44.094867431 +0000 UTC m=+1757.297362569" watchObservedRunningTime="2025-11-25 09:33:44.096505319 
+0000 UTC m=+1757.299000458" Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.101770 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ss96w" Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.110078 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.165136 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.330259 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.427587 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.437983 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.439556 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.459024 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.469576 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.566848 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.570095 4565 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.586310 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.628279 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.854815 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.929054 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 25 09:33:44 crc kubenswrapper[4565]: I1125 09:33:44.938132 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Nov 25 09:33:45 crc kubenswrapper[4565]: I1125 09:33:45.002577 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 25 09:33:45 crc kubenswrapper[4565]: I1125 09:33:45.016705 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 25 09:33:45 crc kubenswrapper[4565]: I1125 09:33:45.208454 4565 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 25 09:33:45 crc kubenswrapper[4565]: I1125 09:33:45.413977 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Nov 25 09:33:45 crc kubenswrapper[4565]: I1125 09:33:45.515663 4565 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 25 09:33:45 crc kubenswrapper[4565]: I1125 09:33:45.553261 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 25 09:33:45 crc kubenswrapper[4565]: I1125 09:33:45.563173 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Nov 25 09:33:45 crc kubenswrapper[4565]: I1125 09:33:45.623332 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 25 09:33:45 crc kubenswrapper[4565]: I1125 09:33:45.663118 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 25 09:33:45 crc kubenswrapper[4565]: I1125 09:33:45.669552 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 25 09:33:45 crc kubenswrapper[4565]: I1125 09:33:45.755544 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 25 09:33:45 crc kubenswrapper[4565]: I1125 09:33:45.962807 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 25 09:33:46 crc kubenswrapper[4565]: I1125 09:33:46.035138 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 25 09:33:46 crc kubenswrapper[4565]: I1125 09:33:46.096802 4565 scope.go:117] "RemoveContainer" containerID="917974a4a670e3c73c3ea6efd2300d879b57787c0c729123a5d7a66c69917b4c" Nov 25 09:33:46 crc kubenswrapper[4565]: I1125 09:33:46.098990 4565 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 09:33:46 crc kubenswrapper[4565]: I1125 09:33:46.156335 4565 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 25 09:33:46 crc kubenswrapper[4565]: I1125 09:33:46.196657 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 25 09:33:46 crc kubenswrapper[4565]: I1125 09:33:46.214043 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-mm6kb" Nov 25 09:33:46 crc kubenswrapper[4565]: I1125 09:33:46.307469 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 25 09:33:46 crc kubenswrapper[4565]: I1125 09:33:46.400034 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 25 09:33:46 crc kubenswrapper[4565]: I1125 09:33:46.431536 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 25 09:33:46 crc kubenswrapper[4565]: I1125 09:33:46.474587 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 25 09:33:46 crc kubenswrapper[4565]: I1125 09:33:46.519300 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-ln4nc" Nov 25 09:33:46 crc kubenswrapper[4565]: I1125 09:33:46.566408 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Nov 25 09:33:46 crc kubenswrapper[4565]: I1125 09:33:46.607339 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 25 09:33:46 crc kubenswrapper[4565]: I1125 09:33:46.620609 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5275621d-5c51-4586-85f2-e0e24cb32266","Type":"ContainerStarted","Data":"5a1bf3fc3ca67ad618cec23b0a08ec334eef609c39bbf1126b1b6151fb026cac"} 
Nov 25 09:33:46 crc kubenswrapper[4565]: I1125 09:33:46.621358 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 25 09:33:46 crc kubenswrapper[4565]: I1125 09:33:46.685980 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-r2h4v" Nov 25 09:33:46 crc kubenswrapper[4565]: I1125 09:33:46.804721 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 25 09:33:46 crc kubenswrapper[4565]: I1125 09:33:46.951891 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.060052 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.300342 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.430107 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.430595 4565 scope.go:117] "RemoveContainer" containerID="3aad156cb91394097a90d86471f024bc6360f0107472880b3fc9d7c39d20c713" Nov 25 09:33:47 crc kubenswrapper[4565]: E1125 09:33:47.430831 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=barbican-operator-controller-manager-86dc4d89c8-cxwrc_openstack-operators(873884b1-6ee8-400c-9ca2-0b0b3c4618e9)\"" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" 
podUID="873884b1-6ee8-400c-9ca2-0b0b3c4618e9" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.478996 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.479461 4565 scope.go:117] "RemoveContainer" containerID="60d8f30480bbc07d9fb2cde42cfe9eaa3c9076fbb3de41f5b2c96706c3bbc4b0" Nov 25 09:33:47 crc kubenswrapper[4565]: E1125 09:33:47.479669 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=cinder-operator-controller-manager-79856dc55c-ddlth_openstack-operators(1af57713-55c3-45ec-b98b-1aac75a2d60b)\"" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" podUID="1af57713-55c3-45ec-b98b-1aac75a2d60b" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.490967 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.491332 4565 scope.go:117] "RemoveContainer" containerID="333feec57ee5037f47acd2d672641c026f1178aa1db59f23cd4d0a0a332c736a" Nov 25 09:33:47 crc kubenswrapper[4565]: E1125 09:33:47.491511 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=heat-operator-controller-manager-774b86978c-bd8d6_openstack-operators(93da1f7e-c5e8-4c9c-b6af-feb85c526b47)\"" pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" podUID="93da1f7e-c5e8-4c9c-b6af-feb85c526b47" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.504848 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" Nov 
25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.505774 4565 scope.go:117] "RemoveContainer" containerID="edc460c87e3dda646dd97574694f175e86617802fe573539d27d9d21dae913fa" Nov 25 09:33:47 crc kubenswrapper[4565]: E1125 09:33:47.506045 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=glance-operator-controller-manager-68b95954c9-f9bbj_openstack-operators(92be75e0-b60b-4f41-bde1-4f74a4d306e3)\"" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" podUID="92be75e0-b60b-4f41-bde1-4f74a4d306e3" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.525769 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.528825 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.530094 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.530627 4565 scope.go:117] "RemoveContainer" containerID="0720d56d8209612e22c0b8e051d898758d5c194ec89bb5c1df3433b39981bc9b" Nov 25 09:33:47 crc kubenswrapper[4565]: E1125 09:33:47.530863 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=designate-operator-controller-manager-7d695c9b56-t68ww_openstack-operators(a933a688-5393-4b7b-b0b7-6ee5791970b1)\"" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" podUID="a933a688-5393-4b7b-b0b7-6ee5791970b1" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.541541 4565 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.542551 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.542910 4565 scope.go:117] "RemoveContainer" containerID="39ce5a59d7c7c5041aa9cbe05a8f612a97ad284fa3bb48e4557c04e282d6796d" Nov 25 09:33:47 crc kubenswrapper[4565]: E1125 09:33:47.543164 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=horizon-operator-controller-manager-68c9694994-2s9lf_openstack-operators(354fe5db-35d0-4d94-989c-02a077f8bd20)\"" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" podUID="354fe5db-35d0-4d94-989c-02a077f8bd20" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.573496 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.641779 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.642276 4565 scope.go:117] "RemoveContainer" containerID="b1c176493b938f65a21d4e1e1a579662243e2bafa7e6c3ac777aa37c48e17c7c" Nov 25 09:33:47 crc kubenswrapper[4565]: E1125 09:33:47.642568 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=ironic-operator-controller-manager-5bfcdc958c-mjsqx_openstack-operators(6402fac4-067f-4410-a00c-0d438d502f3c)\"" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" 
podUID="6402fac4-067f-4410-a00c-0d438d502f3c" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.681798 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.836115 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.841849 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.842622 4565 scope.go:117] "RemoveContainer" containerID="0bd57640bc36417923138c806a0ddf67d2cce1850e4e8017df66f645df8325a0" Nov 25 09:33:47 crc kubenswrapper[4565]: E1125 09:33:47.842965 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=manila-operator-controller-manager-58bb8d67cc-lz6zt_openstack-operators(cf68120a-e894-4189-8035-91f8045618c0)\"" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" podUID="cf68120a-e894-4189-8035-91f8045618c0" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.851079 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-bbdc8" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.874161 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.875302 4565 scope.go:117] "RemoveContainer" containerID="bb38ed534609c8ae37fa41f78e13717faad602d45197ced0da18ec5542675416" Nov 25 09:33:47 crc kubenswrapper[4565]: E1125 09:33:47.875568 4565 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=keystone-operator-controller-manager-748dc6576f-pcqxq_openstack-operators(d5be161b-0f0c-485e-b1c7-50a9fff4b053)\"" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" podUID="d5be161b-0f0c-485e-b1c7-50a9fff4b053" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.895057 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.895454 4565 scope.go:117] "RemoveContainer" containerID="f39dcedc4e006fd2db9645f254c4b3ea358db41b2a05716117ef4f9e0271f316" Nov 25 09:33:47 crc kubenswrapper[4565]: E1125 09:33:47.895684 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=mariadb-operator-controller-manager-cb6c4fdb7-2gkww_openstack-operators(4ee66804-213d-4e52-b04b-6b00eec8de2d)\"" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" podUID="4ee66804-213d-4e52-b04b-6b00eec8de2d" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.977729 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.978511 4565 scope.go:117] "RemoveContainer" containerID="aa3c698b68b2bd17f9b55cc1a21ffe612f175d7f6a374f7bc30e98d6378fffdb" Nov 25 09:33:47 crc kubenswrapper[4565]: E1125 09:33:47.978805 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager 
pod=neutron-operator-controller-manager-7c57c8bbc4-pzd74_openstack-operators(d0ef0237-045a-4153-a377-07b2c9e6ceba)\"" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" podUID="d0ef0237-045a-4153-a377-07b2c9e6ceba" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.996856 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" Nov 25 09:33:47 crc kubenswrapper[4565]: I1125 09:33:47.998812 4565 scope.go:117] "RemoveContainer" containerID="8ff33403676c987edafe00cc1b1214a3de810b90129aa373b2a8bfb758a9d569" Nov 25 09:33:48 crc kubenswrapper[4565]: E1125 09:33:48.000486 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=nova-operator-controller-manager-79556f57fc-n9bdd_openstack-operators(f2c67417-c283-4158-91ec-f49478a5378e)\"" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" podUID="f2c67417-c283-4158-91ec-f49478a5378e" Nov 25 09:33:48 crc kubenswrapper[4565]: I1125 09:33:48.171425 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" Nov 25 09:33:48 crc kubenswrapper[4565]: I1125 09:33:48.171899 4565 scope.go:117] "RemoveContainer" containerID="e020be677c47fe23cd5f8ad7616cb3050bc6e134ad73783fef6fa49aa2df1f5a" Nov 25 09:33:48 crc kubenswrapper[4565]: E1125 09:33:48.172147 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=octavia-operator-controller-manager-fd75fd47d-hrr6t_openstack-operators(6279e5b8-cc23-4b43-9554-754a61174bcd)\"" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" podUID="6279e5b8-cc23-4b43-9554-754a61174bcd" Nov 25 09:33:48 
crc kubenswrapper[4565]: I1125 09:33:48.208419 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" Nov 25 09:33:48 crc kubenswrapper[4565]: I1125 09:33:48.209721 4565 scope.go:117] "RemoveContainer" containerID="83e42ee54a1224d7b2656ad17410377bc7144f68868f1e208f218d89791aa8d2" Nov 25 09:33:48 crc kubenswrapper[4565]: E1125 09:33:48.210127 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=infra-operator-controller-manager-d5cc86f4b-2q9rf_openstack-operators(333ae034-2972-4915-a547-364c01510827)\"" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" podUID="333ae034-2972-4915-a547-364c01510827" Nov 25 09:33:48 crc kubenswrapper[4565]: I1125 09:33:48.218492 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" Nov 25 09:33:48 crc kubenswrapper[4565]: I1125 09:33:48.219195 4565 scope.go:117] "RemoveContainer" containerID="9564dbfc50d702602f079771897f60dfc5280e47ad4f85398621b879eb1202d5" Nov 25 09:33:48 crc kubenswrapper[4565]: E1125 09:33:48.219487 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=ovn-operator-controller-manager-66cf5c67ff-zz6wf_openstack-operators(052c7786-4d54-4af0-8598-91ff09cdf966)\"" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" podUID="052c7786-4d54-4af0-8598-91ff09cdf966" Nov 25 09:33:48 crc kubenswrapper[4565]: I1125 09:33:48.264095 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" Nov 25 09:33:48 crc kubenswrapper[4565]: I1125 09:33:48.264807 4565 
scope.go:117] "RemoveContainer" containerID="2039e083d41f8bbf43c768660ed310c591665d3d19066508b5e04327f665da82" Nov 25 09:33:48 crc kubenswrapper[4565]: E1125 09:33:48.265090 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=placement-operator-controller-manager-5db546f9d9-kgn59_openstack-operators(31dbf471-6fab-4ddd-a384-e4dd5335d5dc)\"" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" podUID="31dbf471-6fab-4ddd-a384-e4dd5335d5dc" Nov 25 09:33:48 crc kubenswrapper[4565]: I1125 09:33:48.307230 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" Nov 25 09:33:48 crc kubenswrapper[4565]: I1125 09:33:48.307776 4565 scope.go:117] "RemoveContainer" containerID="6212e62c68bacd6c61b1d6bd2a7998b9eae5e36bb751573695de237805328fbb" Nov 25 09:33:48 crc kubenswrapper[4565]: E1125 09:33:48.308099 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=swift-operator-controller-manager-6fdc4fcf86-zl2jr_openstack-operators(f35f4446-328e-40d3-96d6-2bc814fb8a96)\"" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" podUID="f35f4446-328e-40d3-96d6-2bc814fb8a96" Nov 25 09:33:48 crc kubenswrapper[4565]: I1125 09:33:48.366884 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 25 09:33:48 crc kubenswrapper[4565]: I1125 09:33:48.371015 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" Nov 25 09:33:48 crc kubenswrapper[4565]: I1125 09:33:48.371837 4565 scope.go:117] "RemoveContainer" 
containerID="e16b7b97243bbe5b43747b8ade9ddfb8233607897b27a4a30e3f397b34533859" Nov 25 09:33:48 crc kubenswrapper[4565]: E1125 09:33:48.372855 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=telemetry-operator-controller-manager-567f98c9d-7dzx4_openstack-operators(1ef630cb-2220-41f5-8a3d-66a2a78ce0ce)\"" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" podUID="1ef630cb-2220-41f5-8a3d-66a2a78ce0ce" Nov 25 09:33:48 crc kubenswrapper[4565]: I1125 09:33:48.388411 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" Nov 25 09:33:48 crc kubenswrapper[4565]: I1125 09:33:48.388863 4565 scope.go:117] "RemoveContainer" containerID="43d2cb4251d1bf6c3249120a7c62ab420056324d129cceca35c85487f5c8e6d6" Nov 25 09:33:48 crc kubenswrapper[4565]: E1125 09:33:48.389121 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=watcher-operator-controller-manager-864885998-v2c96_openstack-operators(3791b99a-d877-470f-8a8f-56f7b02be997)\"" pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" podUID="3791b99a-d877-470f-8a8f-56f7b02be997" Nov 25 09:33:48 crc kubenswrapper[4565]: I1125 09:33:48.465567 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 25 09:33:48 crc kubenswrapper[4565]: I1125 09:33:48.713875 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Nov 25 09:33:49 crc kubenswrapper[4565]: I1125 09:33:49.097490 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d" Nov 25 09:33:49 
crc kubenswrapper[4565]: E1125 09:33:49.097877 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:33:49 crc kubenswrapper[4565]: I1125 09:33:49.580344 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 25 09:33:49 crc kubenswrapper[4565]: I1125 09:33:49.625711 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 25 09:33:50 crc kubenswrapper[4565]: I1125 09:33:50.097789 4565 scope.go:117] "RemoveContainer" containerID="7b2b75825ec45ab3cc5d2eb938e19c7ef8a8f4f5f69d192abc169acd1f2738ab" Nov 25 09:33:50 crc kubenswrapper[4565]: E1125 09:33:50.098083 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=operator pod=rabbitmq-cluster-operator-manager-668c99d594-s4llp_openstack-operators(a65931e1-7a1f-4251-9c4f-996b407dfb03)\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4llp" podUID="a65931e1-7a1f-4251-9c4f-996b407dfb03" Nov 25 09:33:50 crc kubenswrapper[4565]: I1125 09:33:50.098596 4565 scope.go:117] "RemoveContainer" containerID="a2d7d47dc55b713b12285c0b4f59ed400fdea491fccbf6452fa96db4f49cfb32" Nov 25 09:33:50 crc kubenswrapper[4565]: I1125 09:33:50.659570 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" 
event={"ID":"145e5d59-fd78-4bc1-a97c-17ebf0d67fa4","Type":"ContainerStarted","Data":"230229d1ccb4699f287cbc28281231bdebe426db2cb1db3c0c405571b1d07f7d"} Nov 25 09:33:50 crc kubenswrapper[4565]: I1125 09:33:50.660176 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" Nov 25 09:33:51 crc kubenswrapper[4565]: I1125 09:33:51.373858 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 25 09:33:52 crc kubenswrapper[4565]: I1125 09:33:52.105651 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-fkc7l" Nov 25 09:33:54 crc kubenswrapper[4565]: I1125 09:33:54.538366 4565 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 25 09:33:54 crc kubenswrapper[4565]: I1125 09:33:54.538945 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://db447d0f0f9f9858e01a52d6842986ff54040a5bc14d30ef9a5749f3f51caf39" gracePeriod=5 Nov 25 09:33:57 crc kubenswrapper[4565]: I1125 09:33:57.430580 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" Nov 25 09:33:57 crc kubenswrapper[4565]: I1125 09:33:57.431773 4565 scope.go:117] "RemoveContainer" containerID="3aad156cb91394097a90d86471f024bc6360f0107472880b3fc9d7c39d20c713" Nov 25 09:33:57 crc kubenswrapper[4565]: E1125 09:33:57.432097 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager 
pod=barbican-operator-controller-manager-86dc4d89c8-cxwrc_openstack-operators(873884b1-6ee8-400c-9ca2-0b0b3c4618e9)\"" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" podUID="873884b1-6ee8-400c-9ca2-0b0b3c4618e9" Nov 25 09:33:57 crc kubenswrapper[4565]: I1125 09:33:57.479134 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" Nov 25 09:33:57 crc kubenswrapper[4565]: I1125 09:33:57.479577 4565 scope.go:117] "RemoveContainer" containerID="60d8f30480bbc07d9fb2cde42cfe9eaa3c9076fbb3de41f5b2c96706c3bbc4b0" Nov 25 09:33:57 crc kubenswrapper[4565]: E1125 09:33:57.479839 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=cinder-operator-controller-manager-79856dc55c-ddlth_openstack-operators(1af57713-55c3-45ec-b98b-1aac75a2d60b)\"" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" podUID="1af57713-55c3-45ec-b98b-1aac75a2d60b" Nov 25 09:33:57 crc kubenswrapper[4565]: I1125 09:33:57.491112 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" Nov 25 09:33:57 crc kubenswrapper[4565]: I1125 09:33:57.491616 4565 scope.go:117] "RemoveContainer" containerID="333feec57ee5037f47acd2d672641c026f1178aa1db59f23cd4d0a0a332c736a" Nov 25 09:33:57 crc kubenswrapper[4565]: E1125 09:33:57.491883 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=heat-operator-controller-manager-774b86978c-bd8d6_openstack-operators(93da1f7e-c5e8-4c9c-b6af-feb85c526b47)\"" pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" 
podUID="93da1f7e-c5e8-4c9c-b6af-feb85c526b47" Nov 25 09:33:57 crc kubenswrapper[4565]: I1125 09:33:57.505467 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" Nov 25 09:33:57 crc kubenswrapper[4565]: I1125 09:33:57.505966 4565 scope.go:117] "RemoveContainer" containerID="edc460c87e3dda646dd97574694f175e86617802fe573539d27d9d21dae913fa" Nov 25 09:33:57 crc kubenswrapper[4565]: E1125 09:33:57.506206 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=glance-operator-controller-manager-68b95954c9-f9bbj_openstack-operators(92be75e0-b60b-4f41-bde1-4f74a4d306e3)\"" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" podUID="92be75e0-b60b-4f41-bde1-4f74a4d306e3" Nov 25 09:33:57 crc kubenswrapper[4565]: I1125 09:33:57.529393 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" Nov 25 09:33:57 crc kubenswrapper[4565]: I1125 09:33:57.530367 4565 scope.go:117] "RemoveContainer" containerID="0720d56d8209612e22c0b8e051d898758d5c194ec89bb5c1df3433b39981bc9b" Nov 25 09:33:57 crc kubenswrapper[4565]: E1125 09:33:57.530735 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=designate-operator-controller-manager-7d695c9b56-t68ww_openstack-operators(a933a688-5393-4b7b-b0b7-6ee5791970b1)\"" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" podUID="a933a688-5393-4b7b-b0b7-6ee5791970b1" Nov 25 09:33:57 crc kubenswrapper[4565]: I1125 09:33:57.545077 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" Nov 25 09:33:57 crc kubenswrapper[4565]: I1125 09:33:57.546138 4565 scope.go:117] "RemoveContainer" containerID="39ce5a59d7c7c5041aa9cbe05a8f612a97ad284fa3bb48e4557c04e282d6796d" Nov 25 09:33:57 crc kubenswrapper[4565]: E1125 09:33:57.546657 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=horizon-operator-controller-manager-68c9694994-2s9lf_openstack-operators(354fe5db-35d0-4d94-989c-02a077f8bd20)\"" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" podUID="354fe5db-35d0-4d94-989c-02a077f8bd20" Nov 25 09:33:57 crc kubenswrapper[4565]: I1125 09:33:57.641138 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" Nov 25 09:33:57 crc kubenswrapper[4565]: I1125 09:33:57.641968 4565 scope.go:117] "RemoveContainer" containerID="b1c176493b938f65a21d4e1e1a579662243e2bafa7e6c3ac777aa37c48e17c7c" Nov 25 09:33:57 crc kubenswrapper[4565]: E1125 09:33:57.642316 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=ironic-operator-controller-manager-5bfcdc958c-mjsqx_openstack-operators(6402fac4-067f-4410-a00c-0d438d502f3c)\"" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" podUID="6402fac4-067f-4410-a00c-0d438d502f3c" Nov 25 09:33:57 crc kubenswrapper[4565]: I1125 09:33:57.841531 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" Nov 25 09:33:57 crc kubenswrapper[4565]: I1125 09:33:57.843725 4565 scope.go:117] "RemoveContainer" 
containerID="0bd57640bc36417923138c806a0ddf67d2cce1850e4e8017df66f645df8325a0" Nov 25 09:33:57 crc kubenswrapper[4565]: E1125 09:33:57.844073 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=manila-operator-controller-manager-58bb8d67cc-lz6zt_openstack-operators(cf68120a-e894-4189-8035-91f8045618c0)\"" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" podUID="cf68120a-e894-4189-8035-91f8045618c0" Nov 25 09:33:57 crc kubenswrapper[4565]: I1125 09:33:57.873344 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" Nov 25 09:33:57 crc kubenswrapper[4565]: I1125 09:33:57.873781 4565 scope.go:117] "RemoveContainer" containerID="bb38ed534609c8ae37fa41f78e13717faad602d45197ced0da18ec5542675416" Nov 25 09:33:57 crc kubenswrapper[4565]: E1125 09:33:57.874032 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=keystone-operator-controller-manager-748dc6576f-pcqxq_openstack-operators(d5be161b-0f0c-485e-b1c7-50a9fff4b053)\"" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" podUID="d5be161b-0f0c-485e-b1c7-50a9fff4b053" Nov 25 09:33:57 crc kubenswrapper[4565]: I1125 09:33:57.895447 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" Nov 25 09:33:57 crc kubenswrapper[4565]: I1125 09:33:57.895956 4565 scope.go:117] "RemoveContainer" containerID="f39dcedc4e006fd2db9645f254c4b3ea358db41b2a05716117ef4f9e0271f316" Nov 25 09:33:57 crc kubenswrapper[4565]: E1125 09:33:57.896185 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=mariadb-operator-controller-manager-cb6c4fdb7-2gkww_openstack-operators(4ee66804-213d-4e52-b04b-6b00eec8de2d)\"" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" podUID="4ee66804-213d-4e52-b04b-6b00eec8de2d" Nov 25 09:33:57 crc kubenswrapper[4565]: I1125 09:33:57.977970 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" Nov 25 09:33:57 crc kubenswrapper[4565]: I1125 09:33:57.979174 4565 scope.go:117] "RemoveContainer" containerID="aa3c698b68b2bd17f9b55cc1a21ffe612f175d7f6a374f7bc30e98d6378fffdb" Nov 25 09:33:57 crc kubenswrapper[4565]: E1125 09:33:57.979627 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=neutron-operator-controller-manager-7c57c8bbc4-pzd74_openstack-operators(d0ef0237-045a-4153-a377-07b2c9e6ceba)\"" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" podUID="d0ef0237-045a-4153-a377-07b2c9e6ceba" Nov 25 09:33:57 crc kubenswrapper[4565]: I1125 09:33:57.997455 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" Nov 25 09:33:57 crc kubenswrapper[4565]: I1125 09:33:57.998719 4565 scope.go:117] "RemoveContainer" containerID="8ff33403676c987edafe00cc1b1214a3de810b90129aa373b2a8bfb758a9d569" Nov 25 09:33:57 crc kubenswrapper[4565]: E1125 09:33:57.999555 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=nova-operator-controller-manager-79556f57fc-n9bdd_openstack-operators(f2c67417-c283-4158-91ec-f49478a5378e)\"" 
pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" podUID="f2c67417-c283-4158-91ec-f49478a5378e" Nov 25 09:33:58 crc kubenswrapper[4565]: I1125 09:33:58.171639 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" Nov 25 09:33:58 crc kubenswrapper[4565]: I1125 09:33:58.172862 4565 scope.go:117] "RemoveContainer" containerID="e020be677c47fe23cd5f8ad7616cb3050bc6e134ad73783fef6fa49aa2df1f5a" Nov 25 09:33:58 crc kubenswrapper[4565]: E1125 09:33:58.173284 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=octavia-operator-controller-manager-fd75fd47d-hrr6t_openstack-operators(6279e5b8-cc23-4b43-9554-754a61174bcd)\"" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" podUID="6279e5b8-cc23-4b43-9554-754a61174bcd" Nov 25 09:33:58 crc kubenswrapper[4565]: I1125 09:33:58.207648 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" Nov 25 09:33:58 crc kubenswrapper[4565]: I1125 09:33:58.208634 4565 scope.go:117] "RemoveContainer" containerID="83e42ee54a1224d7b2656ad17410377bc7144f68868f1e208f218d89791aa8d2" Nov 25 09:33:58 crc kubenswrapper[4565]: E1125 09:33:58.209120 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=infra-operator-controller-manager-d5cc86f4b-2q9rf_openstack-operators(333ae034-2972-4915-a547-364c01510827)\"" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" podUID="333ae034-2972-4915-a547-364c01510827" Nov 25 09:33:58 crc kubenswrapper[4565]: I1125 09:33:58.218974 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" Nov 25 09:33:58 crc kubenswrapper[4565]: I1125 09:33:58.219670 4565 scope.go:117] "RemoveContainer" containerID="9564dbfc50d702602f079771897f60dfc5280e47ad4f85398621b879eb1202d5" Nov 25 09:33:58 crc kubenswrapper[4565]: E1125 09:33:58.220000 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=ovn-operator-controller-manager-66cf5c67ff-zz6wf_openstack-operators(052c7786-4d54-4af0-8598-91ff09cdf966)\"" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" podUID="052c7786-4d54-4af0-8598-91ff09cdf966" Nov 25 09:33:58 crc kubenswrapper[4565]: I1125 09:33:58.263997 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" Nov 25 09:33:58 crc kubenswrapper[4565]: I1125 09:33:58.265152 4565 scope.go:117] "RemoveContainer" containerID="2039e083d41f8bbf43c768660ed310c591665d3d19066508b5e04327f665da82" Nov 25 09:33:58 crc kubenswrapper[4565]: E1125 09:33:58.265477 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=placement-operator-controller-manager-5db546f9d9-kgn59_openstack-operators(31dbf471-6fab-4ddd-a384-e4dd5335d5dc)\"" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" podUID="31dbf471-6fab-4ddd-a384-e4dd5335d5dc" Nov 25 09:33:58 crc kubenswrapper[4565]: I1125 09:33:58.307999 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" Nov 25 09:33:58 crc kubenswrapper[4565]: I1125 09:33:58.308588 4565 scope.go:117] "RemoveContainer" 
containerID="6212e62c68bacd6c61b1d6bd2a7998b9eae5e36bb751573695de237805328fbb" Nov 25 09:33:58 crc kubenswrapper[4565]: E1125 09:33:58.308825 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=swift-operator-controller-manager-6fdc4fcf86-zl2jr_openstack-operators(f35f4446-328e-40d3-96d6-2bc814fb8a96)\"" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" podUID="f35f4446-328e-40d3-96d6-2bc814fb8a96" Nov 25 09:33:58 crc kubenswrapper[4565]: I1125 09:33:58.371020 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" Nov 25 09:33:58 crc kubenswrapper[4565]: I1125 09:33:58.371965 4565 scope.go:117] "RemoveContainer" containerID="e16b7b97243bbe5b43747b8ade9ddfb8233607897b27a4a30e3f397b34533859" Nov 25 09:33:58 crc kubenswrapper[4565]: E1125 09:33:58.372279 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=telemetry-operator-controller-manager-567f98c9d-7dzx4_openstack-operators(1ef630cb-2220-41f5-8a3d-66a2a78ce0ce)\"" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" podUID="1ef630cb-2220-41f5-8a3d-66a2a78ce0ce" Nov 25 09:33:58 crc kubenswrapper[4565]: I1125 09:33:58.388361 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" Nov 25 09:33:58 crc kubenswrapper[4565]: I1125 09:33:58.389012 4565 scope.go:117] "RemoveContainer" containerID="43d2cb4251d1bf6c3249120a7c62ab420056324d129cceca35c85487f5c8e6d6" Nov 25 09:33:58 crc kubenswrapper[4565]: E1125 09:33:58.389280 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=watcher-operator-controller-manager-864885998-v2c96_openstack-operators(3791b99a-d877-470f-8a8f-56f7b02be997)\"" pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" podUID="3791b99a-d877-470f-8a8f-56f7b02be997" Nov 25 09:33:59 crc kubenswrapper[4565]: I1125 09:33:59.761655 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 25 09:33:59 crc kubenswrapper[4565]: I1125 09:33:59.761727 4565 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="db447d0f0f9f9858e01a52d6842986ff54040a5bc14d30ef9a5749f3f51caf39" exitCode=137 Nov 25 09:34:00 crc kubenswrapper[4565]: I1125 09:34:00.138001 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 25 09:34:00 crc kubenswrapper[4565]: I1125 09:34:00.138515 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 09:34:00 crc kubenswrapper[4565]: I1125 09:34:00.221813 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 25 09:34:00 crc kubenswrapper[4565]: I1125 09:34:00.221889 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:34:00 crc kubenswrapper[4565]: I1125 09:34:00.222070 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 25 09:34:00 crc kubenswrapper[4565]: I1125 09:34:00.222210 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 25 09:34:00 crc kubenswrapper[4565]: I1125 09:34:00.222112 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:34:00 crc kubenswrapper[4565]: I1125 09:34:00.222298 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 25 09:34:00 crc kubenswrapper[4565]: I1125 09:34:00.222411 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 25 09:34:00 crc kubenswrapper[4565]: I1125 09:34:00.222492 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:34:00 crc kubenswrapper[4565]: I1125 09:34:00.222546 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:34:00 crc kubenswrapper[4565]: I1125 09:34:00.223384 4565 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Nov 25 09:34:00 crc kubenswrapper[4565]: I1125 09:34:00.223412 4565 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Nov 25 09:34:00 crc kubenswrapper[4565]: I1125 09:34:00.223423 4565 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 25 09:34:00 crc kubenswrapper[4565]: I1125 09:34:00.223438 4565 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Nov 25 09:34:00 crc kubenswrapper[4565]: I1125 09:34:00.231097 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:34:00 crc kubenswrapper[4565]: I1125 09:34:00.324463 4565 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 25 09:34:00 crc kubenswrapper[4565]: I1125 09:34:00.772916 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 25 09:34:00 crc kubenswrapper[4565]: I1125 09:34:00.773061 4565 scope.go:117] "RemoveContainer" containerID="db447d0f0f9f9858e01a52d6842986ff54040a5bc14d30ef9a5749f3f51caf39" Nov 25 09:34:00 crc kubenswrapper[4565]: I1125 09:34:00.773145 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 09:34:01 crc kubenswrapper[4565]: I1125 09:34:01.097948 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d" Nov 25 09:34:01 crc kubenswrapper[4565]: E1125 09:34:01.098427 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:34:01 crc kubenswrapper[4565]: I1125 09:34:01.107592 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Nov 25 09:34:01 crc kubenswrapper[4565]: I1125 09:34:01.107886 4565 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Nov 25 09:34:01 crc kubenswrapper[4565]: I1125 09:34:01.123662 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 25 09:34:01 crc kubenswrapper[4565]: I1125 09:34:01.123703 4565 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="8a049fc4-4d68-4a40-8d98-872fc27d3494" Nov 25 09:34:01 crc kubenswrapper[4565]: I1125 09:34:01.131895 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 25 09:34:01 crc kubenswrapper[4565]: I1125 09:34:01.132314 4565 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="8a049fc4-4d68-4a40-8d98-872fc27d3494" Nov 25 09:34:04 crc kubenswrapper[4565]: I1125 09:34:04.098160 4565 scope.go:117] "RemoveContainer" containerID="7b2b75825ec45ab3cc5d2eb938e19c7ef8a8f4f5f69d192abc169acd1f2738ab" Nov 25 09:34:04 crc kubenswrapper[4565]: I1125 09:34:04.807148 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4llp" event={"ID":"a65931e1-7a1f-4251-9c4f-996b407dfb03","Type":"ContainerStarted","Data":"a66e1f74c011a7d48387be814de21f163958ea38c4d67db83f286d13e831c546"} Nov 25 09:34:09 crc kubenswrapper[4565]: I1125 09:34:09.098033 4565 scope.go:117] "RemoveContainer" containerID="333feec57ee5037f47acd2d672641c026f1178aa1db59f23cd4d0a0a332c736a" Nov 25 09:34:09 crc kubenswrapper[4565]: I1125 09:34:09.099306 4565 scope.go:117] "RemoveContainer" containerID="9564dbfc50d702602f079771897f60dfc5280e47ad4f85398621b879eb1202d5" Nov 25 09:34:09 crc kubenswrapper[4565]: I1125 09:34:09.849658 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" event={"ID":"93da1f7e-c5e8-4c9c-b6af-feb85c526b47","Type":"ContainerStarted","Data":"d47aec414026c2aa1f61d6259749d8c0eb45f51bce982c040d54c57a5edc8e88"} Nov 25 09:34:09 crc kubenswrapper[4565]: I1125 09:34:09.850135 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" Nov 25 09:34:09 crc kubenswrapper[4565]: I1125 09:34:09.852240 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" event={"ID":"052c7786-4d54-4af0-8598-91ff09cdf966","Type":"ContainerStarted","Data":"08e815e4764e9e68f27db9e46d7e626c50a4f8a64e16c2fa5e9952da543e3483"} Nov 25 09:34:09 crc kubenswrapper[4565]: I1125 09:34:09.852835 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" Nov 25 09:34:10 crc kubenswrapper[4565]: I1125 09:34:10.097741 4565 scope.go:117] "RemoveContainer" containerID="e020be677c47fe23cd5f8ad7616cb3050bc6e134ad73783fef6fa49aa2df1f5a" Nov 25 09:34:10 crc kubenswrapper[4565]: I1125 09:34:10.097800 4565 scope.go:117] "RemoveContainer" containerID="8ff33403676c987edafe00cc1b1214a3de810b90129aa373b2a8bfb758a9d569" Nov 25 09:34:10 crc kubenswrapper[4565]: I1125 09:34:10.097862 4565 scope.go:117] "RemoveContainer" containerID="e16b7b97243bbe5b43747b8ade9ddfb8233607897b27a4a30e3f397b34533859" Nov 25 09:34:10 crc kubenswrapper[4565]: I1125 09:34:10.098122 4565 scope.go:117] "RemoveContainer" containerID="b1c176493b938f65a21d4e1e1a579662243e2bafa7e6c3ac777aa37c48e17c7c" Nov 25 09:34:10 crc kubenswrapper[4565]: I1125 09:34:10.098371 4565 scope.go:117] "RemoveContainer" containerID="39ce5a59d7c7c5041aa9cbe05a8f612a97ad284fa3bb48e4557c04e282d6796d" Nov 25 09:34:10 crc kubenswrapper[4565]: I1125 09:34:10.868062 4565 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" event={"ID":"6402fac4-067f-4410-a00c-0d438d502f3c","Type":"ContainerStarted","Data":"6007a5c2c79923bb2bc4945f5491e3fe05a2751cdfa458b08751684a9ee64ebf"} Nov 25 09:34:10 crc kubenswrapper[4565]: I1125 09:34:10.868567 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" Nov 25 09:34:10 crc kubenswrapper[4565]: I1125 09:34:10.872709 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" event={"ID":"f2c67417-c283-4158-91ec-f49478a5378e","Type":"ContainerStarted","Data":"5f0c563e84f1c3fdbee0ae048b644d330803cbd8382f8e518a04efae6203d567"} Nov 25 09:34:10 crc kubenswrapper[4565]: I1125 09:34:10.873214 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" Nov 25 09:34:10 crc kubenswrapper[4565]: I1125 09:34:10.877095 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" event={"ID":"354fe5db-35d0-4d94-989c-02a077f8bd20","Type":"ContainerStarted","Data":"7295bc8f81442c47c6eee6a353cb14e331650228f0a659ec8ad305f3e1455f47"} Nov 25 09:34:10 crc kubenswrapper[4565]: I1125 09:34:10.877474 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" Nov 25 09:34:10 crc kubenswrapper[4565]: I1125 09:34:10.884820 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" event={"ID":"6279e5b8-cc23-4b43-9554-754a61174bcd","Type":"ContainerStarted","Data":"6c0e047e2efde19954fead3a9f97ac20e9aa9861797de73e3fcdef3305f7369c"} Nov 25 09:34:10 crc kubenswrapper[4565]: I1125 09:34:10.885070 4565 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" Nov 25 09:34:10 crc kubenswrapper[4565]: I1125 09:34:10.889601 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" event={"ID":"1ef630cb-2220-41f5-8a3d-66a2a78ce0ce","Type":"ContainerStarted","Data":"7815e037fd8202db741565c023cb49285c45b8c9cede153ff88cfede37b81054"} Nov 25 09:34:11 crc kubenswrapper[4565]: I1125 09:34:11.098621 4565 scope.go:117] "RemoveContainer" containerID="bb38ed534609c8ae37fa41f78e13717faad602d45197ced0da18ec5542675416" Nov 25 09:34:11 crc kubenswrapper[4565]: I1125 09:34:11.098916 4565 scope.go:117] "RemoveContainer" containerID="2039e083d41f8bbf43c768660ed310c591665d3d19066508b5e04327f665da82" Nov 25 09:34:11 crc kubenswrapper[4565]: I1125 09:34:11.099370 4565 scope.go:117] "RemoveContainer" containerID="aa3c698b68b2bd17f9b55cc1a21ffe612f175d7f6a374f7bc30e98d6378fffdb" Nov 25 09:34:11 crc kubenswrapper[4565]: I1125 09:34:11.099629 4565 scope.go:117] "RemoveContainer" containerID="0720d56d8209612e22c0b8e051d898758d5c194ec89bb5c1df3433b39981bc9b" Nov 25 09:34:11 crc kubenswrapper[4565]: I1125 09:34:11.099876 4565 scope.go:117] "RemoveContainer" containerID="6212e62c68bacd6c61b1d6bd2a7998b9eae5e36bb751573695de237805328fbb" Nov 25 09:34:11 crc kubenswrapper[4565]: I1125 09:34:11.100342 4565 scope.go:117] "RemoveContainer" containerID="3aad156cb91394097a90d86471f024bc6360f0107472880b3fc9d7c39d20c713" Nov 25 09:34:11 crc kubenswrapper[4565]: I1125 09:34:11.100988 4565 scope.go:117] "RemoveContainer" containerID="0bd57640bc36417923138c806a0ddf67d2cce1850e4e8017df66f645df8325a0" Nov 25 09:34:11 crc kubenswrapper[4565]: I1125 09:34:11.901132 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" 
event={"ID":"a933a688-5393-4b7b-b0b7-6ee5791970b1","Type":"ContainerStarted","Data":"549ac725f29dfdca01bf7cbd221fb69cac36fb417e64ade6128dd96869185d27"} Nov 25 09:34:11 crc kubenswrapper[4565]: I1125 09:34:11.901721 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" Nov 25 09:34:11 crc kubenswrapper[4565]: I1125 09:34:11.906887 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" event={"ID":"d0ef0237-045a-4153-a377-07b2c9e6ceba","Type":"ContainerStarted","Data":"983770c98a2e1595b5646d9481defdefc9a33f976596905162e31be7988cc04f"} Nov 25 09:34:11 crc kubenswrapper[4565]: I1125 09:34:11.907252 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" Nov 25 09:34:11 crc kubenswrapper[4565]: I1125 09:34:11.910072 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" event={"ID":"d5be161b-0f0c-485e-b1c7-50a9fff4b053","Type":"ContainerStarted","Data":"9b4b139991d0099ac736ffdea28bdd10a37936d7f3dd7b2f925aa9324ae2f7c2"} Nov 25 09:34:11 crc kubenswrapper[4565]: I1125 09:34:11.910419 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" Nov 25 09:34:11 crc kubenswrapper[4565]: I1125 09:34:11.911812 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" event={"ID":"f35f4446-328e-40d3-96d6-2bc814fb8a96","Type":"ContainerStarted","Data":"0c929821e212c962389e3515aaaabdb169676a73b4b180d9e75f1925b6cf16af"} Nov 25 09:34:11 crc kubenswrapper[4565]: I1125 09:34:11.912002 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" Nov 25 09:34:11 crc kubenswrapper[4565]: I1125 09:34:11.914019 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" event={"ID":"873884b1-6ee8-400c-9ca2-0b0b3c4618e9","Type":"ContainerStarted","Data":"39eb493a01f6ae82a38fe4013cc2cb91c77ade14510ed9b6762eb9f0f9db2bc2"} Nov 25 09:34:11 crc kubenswrapper[4565]: I1125 09:34:11.914530 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" Nov 25 09:34:11 crc kubenswrapper[4565]: I1125 09:34:11.916996 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" event={"ID":"cf68120a-e894-4189-8035-91f8045618c0","Type":"ContainerStarted","Data":"38e696396ecc961d197a1c9948cf23489c02ba0b631679225df4b1b7f198d467"} Nov 25 09:34:11 crc kubenswrapper[4565]: I1125 09:34:11.917513 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" Nov 25 09:34:11 crc kubenswrapper[4565]: I1125 09:34:11.922177 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" event={"ID":"31dbf471-6fab-4ddd-a384-e4dd5335d5dc","Type":"ContainerStarted","Data":"b9010c2db03cef7791d9d659739e2507bf4bfd4023f858ca5f34885be924fb52"} Nov 25 09:34:11 crc kubenswrapper[4565]: I1125 09:34:11.922662 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" Nov 25 09:34:12 crc kubenswrapper[4565]: I1125 09:34:12.097250 4565 scope.go:117] "RemoveContainer" containerID="f39dcedc4e006fd2db9645f254c4b3ea358db41b2a05716117ef4f9e0271f316" Nov 25 09:34:12 crc kubenswrapper[4565]: I1125 
09:34:12.097374 4565 scope.go:117] "RemoveContainer" containerID="edc460c87e3dda646dd97574694f175e86617802fe573539d27d9d21dae913fa" Nov 25 09:34:12 crc kubenswrapper[4565]: I1125 09:34:12.097508 4565 scope.go:117] "RemoveContainer" containerID="83e42ee54a1224d7b2656ad17410377bc7144f68868f1e208f218d89791aa8d2" Nov 25 09:34:12 crc kubenswrapper[4565]: I1125 09:34:12.934259 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" event={"ID":"4ee66804-213d-4e52-b04b-6b00eec8de2d","Type":"ContainerStarted","Data":"a96080666c8e5b01ef602503c0543436fa7b98220ef3314345ead4c737cd7f54"} Nov 25 09:34:12 crc kubenswrapper[4565]: I1125 09:34:12.935064 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" Nov 25 09:34:12 crc kubenswrapper[4565]: I1125 09:34:12.937209 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" event={"ID":"333ae034-2972-4915-a547-364c01510827","Type":"ContainerStarted","Data":"dd33761412a2c6d0b23d73afb0e236ecc942391e153d266dc4fc798f066a4b35"} Nov 25 09:34:12 crc kubenswrapper[4565]: I1125 09:34:12.937820 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" Nov 25 09:34:12 crc kubenswrapper[4565]: I1125 09:34:12.941418 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" event={"ID":"92be75e0-b60b-4f41-bde1-4f74a4d306e3","Type":"ContainerStarted","Data":"58597d208011dd4010177779680d1e29ace858a26930054fdeaf1e95fbaa5c25"} Nov 25 09:34:13 crc kubenswrapper[4565]: I1125 09:34:13.098013 4565 scope.go:117] "RemoveContainer" containerID="43d2cb4251d1bf6c3249120a7c62ab420056324d129cceca35c85487f5c8e6d6" Nov 25 09:34:13 crc kubenswrapper[4565]: I1125 
09:34:13.098827 4565 scope.go:117] "RemoveContainer" containerID="60d8f30480bbc07d9fb2cde42cfe9eaa3c9076fbb3de41f5b2c96706c3bbc4b0" Nov 25 09:34:13 crc kubenswrapper[4565]: I1125 09:34:13.957119 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" event={"ID":"1af57713-55c3-45ec-b98b-1aac75a2d60b","Type":"ContainerStarted","Data":"d262d3bee7b224e6f98bcdf2e92c7afa521d68b7d60474422d15b0f8d83b591f"} Nov 25 09:34:13 crc kubenswrapper[4565]: I1125 09:34:13.958141 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" Nov 25 09:34:13 crc kubenswrapper[4565]: I1125 09:34:13.959088 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" event={"ID":"3791b99a-d877-470f-8a8f-56f7b02be997","Type":"ContainerStarted","Data":"f5f206c9cdb20cb20ae8a06d2959451043bf0c2607a54531ce3601f747bf0dfd"} Nov 25 09:34:15 crc kubenswrapper[4565]: I1125 09:34:15.100861 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d" Nov 25 09:34:15 crc kubenswrapper[4565]: E1125 09:34:15.101667 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:34:17 crc kubenswrapper[4565]: I1125 09:34:17.431999 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-cxwrc" Nov 25 09:34:17 crc kubenswrapper[4565]: I1125 09:34:17.493122 4565 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-774b86978c-bd8d6" Nov 25 09:34:17 crc kubenswrapper[4565]: I1125 09:34:17.505647 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" Nov 25 09:34:17 crc kubenswrapper[4565]: I1125 09:34:17.527157 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-f9bbj" Nov 25 09:34:17 crc kubenswrapper[4565]: I1125 09:34:17.529843 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-t68ww" Nov 25 09:34:17 crc kubenswrapper[4565]: I1125 09:34:17.575304 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-2s9lf" Nov 25 09:34:17 crc kubenswrapper[4565]: I1125 09:34:17.645119 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-mjsqx" Nov 25 09:34:17 crc kubenswrapper[4565]: I1125 09:34:17.844554 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-lz6zt" Nov 25 09:34:17 crc kubenswrapper[4565]: I1125 09:34:17.875890 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-pcqxq" Nov 25 09:34:17 crc kubenswrapper[4565]: I1125 09:34:17.902854 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-2gkww" Nov 25 09:34:17 crc kubenswrapper[4565]: I1125 09:34:17.996173 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-pzd74" Nov 25 09:34:18 crc kubenswrapper[4565]: I1125 09:34:18.007133 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n9bdd" Nov 25 09:34:18 crc kubenswrapper[4565]: I1125 09:34:18.173919 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-hrr6t" Nov 25 09:34:18 crc kubenswrapper[4565]: I1125 09:34:18.214244 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-2q9rf" Nov 25 09:34:18 crc kubenswrapper[4565]: I1125 09:34:18.225065 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-zz6wf" Nov 25 09:34:18 crc kubenswrapper[4565]: I1125 09:34:18.273115 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-kgn59" Nov 25 09:34:18 crc kubenswrapper[4565]: I1125 09:34:18.327893 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-zl2jr" Nov 25 09:34:18 crc kubenswrapper[4565]: I1125 09:34:18.372405 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" Nov 25 09:34:18 crc kubenswrapper[4565]: I1125 09:34:18.378360 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-7dzx4" Nov 25 09:34:18 crc kubenswrapper[4565]: I1125 09:34:18.394086 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" Nov 25 09:34:18 crc kubenswrapper[4565]: I1125 09:34:18.422814 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-864885998-v2c96" Nov 25 09:34:26 crc kubenswrapper[4565]: I1125 09:34:26.664403 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-74454849f9-fjwfp" Nov 25 09:34:27 crc kubenswrapper[4565]: I1125 09:34:27.480285 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-ddlth" Nov 25 09:34:30 crc kubenswrapper[4565]: I1125 09:34:30.098211 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d" Nov 25 09:34:31 crc kubenswrapper[4565]: I1125 09:34:31.116196 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerStarted","Data":"538bdafdafa57c950bb08b4eaa710350c74e791fe340685c16a050dcd97f3d53"} Nov 25 09:35:02 crc kubenswrapper[4565]: I1125 09:35:02.965845 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5"] Nov 25 09:35:02 crc kubenswrapper[4565]: I1125 09:35:02.983640 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sfksn"] Nov 25 09:35:02 crc kubenswrapper[4565]: I1125 09:35:02.991852 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5"] Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.003880 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9"] Nov 25 
09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.008347 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zj2st"] Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.013563 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-689ff"] Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.018356 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6vnj9"] Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.023201 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw"] Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.028190 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sfksn"] Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.033165 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w"] Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.037852 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-gjznj"] Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.042856 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c6hz5"] Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.047804 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zj2st"] Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.052476 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r68v5"] Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.057765 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpc87"] Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.063757 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-689ff"] Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.070387 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-gjznj"] Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.075544 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpc87"] Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.080824 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p7bfw"] Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.086049 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zcg5w"] Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.106744 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ac9f2c5-d89d-44d7-8138-875852e7c565" path="/var/lib/kubelet/pods/0ac9f2c5-d89d-44d7-8138-875852e7c565/volumes" Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.107319 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2574f63c-b7b2-4edf-a3f3-808a81438878" path="/var/lib/kubelet/pods/2574f63c-b7b2-4edf-a3f3-808a81438878/volumes" Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.107823 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3da40e24-5d26-4a34-a822-67953bfe3207" path="/var/lib/kubelet/pods/3da40e24-5d26-4a34-a822-67953bfe3207/volumes" Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.108342 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68414e6d-0dc0-43ee-9273-6b4fc3e43563" path="/var/lib/kubelet/pods/68414e6d-0dc0-43ee-9273-6b4fc3e43563/volumes" Nov 
25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.110156 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70ac68ea-c3a1-427c-b07a-1e1e5bc6e466" path="/var/lib/kubelet/pods/70ac68ea-c3a1-427c-b07a-1e1e5bc6e466/volumes" Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.110656 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70daaa2e-42b4-4cf4-9b5a-cb44a6293db8" path="/var/lib/kubelet/pods/70daaa2e-42b4-4cf4-9b5a-cb44a6293db8/volumes" Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.111196 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe" path="/var/lib/kubelet/pods/8d55d1ac-a9f7-47d7-8bfc-acbeadbcfffe/volumes" Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.112168 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a799faaf-4d07-46a0-8b5b-ffa7af279ab7" path="/var/lib/kubelet/pods/a799faaf-4d07-46a0-8b5b-ffa7af279ab7/volumes" Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.112662 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d461ae1d-1d53-43af-830d-a4e301627bf9" path="/var/lib/kubelet/pods/d461ae1d-1d53-43af-830d-a4e301627bf9/volumes" Nov 25 09:35:03 crc kubenswrapper[4565]: I1125 09:35:03.113175 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6043a07-8d34-4d17-84b7-71fcd378f0b6" path="/var/lib/kubelet/pods/f6043a07-8d34-4d17-84b7-71fcd378f0b6/volumes" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.307267 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw"] Nov 25 09:35:15 crc kubenswrapper[4565]: E1125 09:35:15.308478 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae3fe2ce-4759-4160-b326-4f1e15679e96" containerName="installer" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.308496 4565 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ae3fe2ce-4759-4160-b326-4f1e15679e96" containerName="installer" Nov 25 09:35:15 crc kubenswrapper[4565]: E1125 09:35:15.308531 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.308537 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.308734 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.308757 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae3fe2ce-4759-4160-b326-4f1e15679e96" containerName="installer" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.309424 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.316908 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.317190 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.317510 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.318855 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.321835 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 
09:35:15.359842 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw"] Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.379134 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw\" (UID: \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.379188 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw\" (UID: \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.379309 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw\" (UID: \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.379409 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85gkn\" (UniqueName: \"kubernetes.io/projected/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-kube-api-access-85gkn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw\" (UID: \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw" Nov 25 09:35:15 crc 
kubenswrapper[4565]: I1125 09:35:15.379441 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw\" (UID: \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.482006 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw\" (UID: \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.482197 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85gkn\" (UniqueName: \"kubernetes.io/projected/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-kube-api-access-85gkn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw\" (UID: \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.482634 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw\" (UID: \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.483773 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-ceph\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw\" (UID: \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.483828 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw\" (UID: \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.491552 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw\" (UID: \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.493343 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw\" (UID: \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.493392 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw\" (UID: \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.493741 4565 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw\" (UID: \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.505186 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85gkn\" (UniqueName: \"kubernetes.io/projected/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-kube-api-access-85gkn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw\" (UID: \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw" Nov 25 09:35:15 crc kubenswrapper[4565]: I1125 09:35:15.636746 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw" Nov 25 09:35:16 crc kubenswrapper[4565]: I1125 09:35:16.256456 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw"] Nov 25 09:35:16 crc kubenswrapper[4565]: W1125 09:35:16.268535 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode29506cc_593c_49fd_b8eb_0ec1c0c8be5b.slice/crio-a88c7a5c3a42fdd5bf17c619daa49c7aa78c770d50f87c8a54dca6fee3100157 WatchSource:0}: Error finding container a88c7a5c3a42fdd5bf17c619daa49c7aa78c770d50f87c8a54dca6fee3100157: Status 404 returned error can't find the container with id a88c7a5c3a42fdd5bf17c619daa49c7aa78c770d50f87c8a54dca6fee3100157 Nov 25 09:35:16 crc kubenswrapper[4565]: I1125 09:35:16.496237 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw" 
event={"ID":"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b","Type":"ContainerStarted","Data":"a88c7a5c3a42fdd5bf17c619daa49c7aa78c770d50f87c8a54dca6fee3100157"} Nov 25 09:35:17 crc kubenswrapper[4565]: I1125 09:35:17.507780 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw" event={"ID":"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b","Type":"ContainerStarted","Data":"4953ab3d88b06937dfec0d4780fbaa97461dc64c64d50de2f77b8cb745b72af0"} Nov 25 09:35:17 crc kubenswrapper[4565]: I1125 09:35:17.534053 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw" podStartSLOduration=1.989958493 podStartE2EDuration="2.534033961s" podCreationTimestamp="2025-11-25 09:35:15 +0000 UTC" firstStartedPulling="2025-11-25 09:35:16.271159724 +0000 UTC m=+1849.473654861" lastFinishedPulling="2025-11-25 09:35:16.815235191 +0000 UTC m=+1850.017730329" observedRunningTime="2025-11-25 09:35:17.532379092 +0000 UTC m=+1850.734874229" watchObservedRunningTime="2025-11-25 09:35:17.534033961 +0000 UTC m=+1850.736529099" Nov 25 09:35:26 crc kubenswrapper[4565]: I1125 09:35:26.592273 4565 generic.go:334] "Generic (PLEG): container finished" podID="e29506cc-593c-49fd-b8eb-0ec1c0c8be5b" containerID="4953ab3d88b06937dfec0d4780fbaa97461dc64c64d50de2f77b8cb745b72af0" exitCode=0 Nov 25 09:35:26 crc kubenswrapper[4565]: I1125 09:35:26.592346 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw" event={"ID":"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b","Type":"ContainerDied","Data":"4953ab3d88b06937dfec0d4780fbaa97461dc64c64d50de2f77b8cb745b72af0"} Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.050578 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.199764 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-ceph\") pod \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\" (UID: \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\") " Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.199837 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-repo-setup-combined-ca-bundle\") pod \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\" (UID: \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\") " Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.199880 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85gkn\" (UniqueName: \"kubernetes.io/projected/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-kube-api-access-85gkn\") pod \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\" (UID: \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\") " Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.200020 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-ssh-key\") pod \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\" (UID: \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\") " Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.200759 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-inventory\") pod \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\" (UID: \"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b\") " Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.206767 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-ceph" (OuterVolumeSpecName: "ceph") pod "e29506cc-593c-49fd-b8eb-0ec1c0c8be5b" (UID: "e29506cc-593c-49fd-b8eb-0ec1c0c8be5b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.206892 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-kube-api-access-85gkn" (OuterVolumeSpecName: "kube-api-access-85gkn") pod "e29506cc-593c-49fd-b8eb-0ec1c0c8be5b" (UID: "e29506cc-593c-49fd-b8eb-0ec1c0c8be5b"). InnerVolumeSpecName "kube-api-access-85gkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.208171 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e29506cc-593c-49fd-b8eb-0ec1c0c8be5b" (UID: "e29506cc-593c-49fd-b8eb-0ec1c0c8be5b"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.230087 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-inventory" (OuterVolumeSpecName: "inventory") pod "e29506cc-593c-49fd-b8eb-0ec1c0c8be5b" (UID: "e29506cc-593c-49fd-b8eb-0ec1c0c8be5b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.233186 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e29506cc-593c-49fd-b8eb-0ec1c0c8be5b" (UID: "e29506cc-593c-49fd-b8eb-0ec1c0c8be5b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.304045 4565 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.304080 4565 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-ceph\") on node \"crc\" DevicePath \"\"" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.304095 4565 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.304113 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85gkn\" (UniqueName: \"kubernetes.io/projected/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-kube-api-access-85gkn\") on node \"crc\" DevicePath \"\"" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.304125 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e29506cc-593c-49fd-b8eb-0ec1c0c8be5b-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.617336 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw" event={"ID":"e29506cc-593c-49fd-b8eb-0ec1c0c8be5b","Type":"ContainerDied","Data":"a88c7a5c3a42fdd5bf17c619daa49c7aa78c770d50f87c8a54dca6fee3100157"} Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.617405 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a88c7a5c3a42fdd5bf17c619daa49c7aa78c770d50f87c8a54dca6fee3100157" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.617412 
4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.710452 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f"] Nov 25 09:35:28 crc kubenswrapper[4565]: E1125 09:35:28.710824 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e29506cc-593c-49fd-b8eb-0ec1c0c8be5b" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.710845 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="e29506cc-593c-49fd-b8eb-0ec1c0c8be5b" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.711033 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="e29506cc-593c-49fd-b8eb-0ec1c0c8be5b" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.711664 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.716052 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.716266 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.717072 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.717090 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.717122 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.732335 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f"] Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.812976 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de99810-1335-466f-8386-e4ecbe49f3fd-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f\" (UID: \"3de99810-1335-466f-8386-e4ecbe49f3fd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.813069 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9scdb\" (UniqueName: \"kubernetes.io/projected/3de99810-1335-466f-8386-e4ecbe49f3fd-kube-api-access-9scdb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f\" (UID: 
\"3de99810-1335-466f-8386-e4ecbe49f3fd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.813113 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3de99810-1335-466f-8386-e4ecbe49f3fd-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f\" (UID: \"3de99810-1335-466f-8386-e4ecbe49f3fd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.813138 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3de99810-1335-466f-8386-e4ecbe49f3fd-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f\" (UID: \"3de99810-1335-466f-8386-e4ecbe49f3fd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.813290 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3de99810-1335-466f-8386-e4ecbe49f3fd-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f\" (UID: \"3de99810-1335-466f-8386-e4ecbe49f3fd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.915333 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de99810-1335-466f-8386-e4ecbe49f3fd-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f\" (UID: \"3de99810-1335-466f-8386-e4ecbe49f3fd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.915746 4565 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9scdb\" (UniqueName: \"kubernetes.io/projected/3de99810-1335-466f-8386-e4ecbe49f3fd-kube-api-access-9scdb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f\" (UID: \"3de99810-1335-466f-8386-e4ecbe49f3fd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.915813 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3de99810-1335-466f-8386-e4ecbe49f3fd-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f\" (UID: \"3de99810-1335-466f-8386-e4ecbe49f3fd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.915846 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3de99810-1335-466f-8386-e4ecbe49f3fd-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f\" (UID: \"3de99810-1335-466f-8386-e4ecbe49f3fd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.916105 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3de99810-1335-466f-8386-e4ecbe49f3fd-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f\" (UID: \"3de99810-1335-466f-8386-e4ecbe49f3fd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.920363 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3de99810-1335-466f-8386-e4ecbe49f3fd-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f\" (UID: \"3de99810-1335-466f-8386-e4ecbe49f3fd\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.920875 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de99810-1335-466f-8386-e4ecbe49f3fd-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f\" (UID: \"3de99810-1335-466f-8386-e4ecbe49f3fd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.921024 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3de99810-1335-466f-8386-e4ecbe49f3fd-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f\" (UID: \"3de99810-1335-466f-8386-e4ecbe49f3fd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.922381 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3de99810-1335-466f-8386-e4ecbe49f3fd-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f\" (UID: \"3de99810-1335-466f-8386-e4ecbe49f3fd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f" Nov 25 09:35:28 crc kubenswrapper[4565]: I1125 09:35:28.930663 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9scdb\" (UniqueName: \"kubernetes.io/projected/3de99810-1335-466f-8386-e4ecbe49f3fd-kube-api-access-9scdb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f\" (UID: \"3de99810-1335-466f-8386-e4ecbe49f3fd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f" Nov 25 09:35:29 crc kubenswrapper[4565]: I1125 09:35:29.025823 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f" Nov 25 09:35:29 crc kubenswrapper[4565]: I1125 09:35:29.545274 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f"] Nov 25 09:35:29 crc kubenswrapper[4565]: I1125 09:35:29.626199 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f" event={"ID":"3de99810-1335-466f-8386-e4ecbe49f3fd","Type":"ContainerStarted","Data":"03e98523b1a7fc219df1558fba05a3bbec08c0593f55c2c21e929449e10928e2"} Nov 25 09:35:30 crc kubenswrapper[4565]: I1125 09:35:30.657805 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f" event={"ID":"3de99810-1335-466f-8386-e4ecbe49f3fd","Type":"ContainerStarted","Data":"8787b14f72360d7fa7bcb850102436dca1425b622635c2df612d0d35e81cd233"} Nov 25 09:35:30 crc kubenswrapper[4565]: I1125 09:35:30.677503 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f" podStartSLOduration=2.157611754 podStartE2EDuration="2.677481578s" podCreationTimestamp="2025-11-25 09:35:28 +0000 UTC" firstStartedPulling="2025-11-25 09:35:29.553717183 +0000 UTC m=+1862.756212322" lastFinishedPulling="2025-11-25 09:35:30.073587007 +0000 UTC m=+1863.276082146" observedRunningTime="2025-11-25 09:35:30.673612013 +0000 UTC m=+1863.876107151" watchObservedRunningTime="2025-11-25 09:35:30.677481578 +0000 UTC m=+1863.879976716" Nov 25 09:35:35 crc kubenswrapper[4565]: I1125 09:35:35.961307 4565 scope.go:117] "RemoveContainer" containerID="72ac22149b770785f726966d3b3faf8bc91935ea780cf8513c3ed9a3e05794a4" Nov 25 09:35:36 crc kubenswrapper[4565]: I1125 09:35:36.023625 4565 scope.go:117] "RemoveContainer" containerID="51a83edc25335af15bccc91e7ef4e9040c9264d323d232d9be0d7c6b02429ded" Nov 25 09:35:36 crc 
kubenswrapper[4565]: I1125 09:35:36.055534 4565 scope.go:117] "RemoveContainer" containerID="3b6cb2ae6cfb621acc892b728bfdb02fea77c6cdc477a9198130152f57e41fcd" Nov 25 09:35:36 crc kubenswrapper[4565]: I1125 09:35:36.152056 4565 scope.go:117] "RemoveContainer" containerID="216935a3513cec14380d02f4c7b97d9512d00f0ba5456773a1df1cf2e8b34651" Nov 25 09:35:36 crc kubenswrapper[4565]: I1125 09:35:36.183407 4565 scope.go:117] "RemoveContainer" containerID="50c07b19060375a1c048d913967518c2e8da6a677e127af3bd7b4138228d1660" Nov 25 09:35:36 crc kubenswrapper[4565]: I1125 09:35:36.252782 4565 scope.go:117] "RemoveContainer" containerID="3a7b58c76992e9ac43411a2c966c07e745cd234be0fa4cd96d543e695fdd3a64" Nov 25 09:35:36 crc kubenswrapper[4565]: I1125 09:35:36.326873 4565 scope.go:117] "RemoveContainer" containerID="ff3cba882720548764d9b2963520e20956b57f03b7f6243fbdc164c580cea380" Nov 25 09:36:36 crc kubenswrapper[4565]: I1125 09:36:36.451288 4565 scope.go:117] "RemoveContainer" containerID="e888ae7606b63e7c323b04ea176dbf72d037d6fd3f710b8424aa80f24de7555d" Nov 25 09:36:36 crc kubenswrapper[4565]: I1125 09:36:36.499436 4565 scope.go:117] "RemoveContainer" containerID="088dea403f5a74872d48e5d0262b778d3613d565de11bf54a5a78afe49847ed2" Nov 25 09:36:36 crc kubenswrapper[4565]: I1125 09:36:36.538201 4565 scope.go:117] "RemoveContainer" containerID="43c4ff9bf5adca0cb9b01593e8a485280d1f9c8e33dd39486e7624eba6119c25" Nov 25 09:36:55 crc kubenswrapper[4565]: I1125 09:36:55.100123 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:36:55 crc kubenswrapper[4565]: I1125 09:36:55.100716 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:37:11 crc kubenswrapper[4565]: I1125 09:37:11.571247 4565 generic.go:334] "Generic (PLEG): container finished" podID="3de99810-1335-466f-8386-e4ecbe49f3fd" containerID="8787b14f72360d7fa7bcb850102436dca1425b622635c2df612d0d35e81cd233" exitCode=0 Nov 25 09:37:11 crc kubenswrapper[4565]: I1125 09:37:11.571349 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f" event={"ID":"3de99810-1335-466f-8386-e4ecbe49f3fd","Type":"ContainerDied","Data":"8787b14f72360d7fa7bcb850102436dca1425b622635c2df612d0d35e81cd233"} Nov 25 09:37:12 crc kubenswrapper[4565]: I1125 09:37:12.963009 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.133356 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de99810-1335-466f-8386-e4ecbe49f3fd-bootstrap-combined-ca-bundle\") pod \"3de99810-1335-466f-8386-e4ecbe49f3fd\" (UID: \"3de99810-1335-466f-8386-e4ecbe49f3fd\") " Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.133744 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3de99810-1335-466f-8386-e4ecbe49f3fd-ssh-key\") pod \"3de99810-1335-466f-8386-e4ecbe49f3fd\" (UID: \"3de99810-1335-466f-8386-e4ecbe49f3fd\") " Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.133977 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9scdb\" (UniqueName: \"kubernetes.io/projected/3de99810-1335-466f-8386-e4ecbe49f3fd-kube-api-access-9scdb\") pod \"3de99810-1335-466f-8386-e4ecbe49f3fd\" 
(UID: \"3de99810-1335-466f-8386-e4ecbe49f3fd\") " Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.134056 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3de99810-1335-466f-8386-e4ecbe49f3fd-inventory\") pod \"3de99810-1335-466f-8386-e4ecbe49f3fd\" (UID: \"3de99810-1335-466f-8386-e4ecbe49f3fd\") " Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.134127 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3de99810-1335-466f-8386-e4ecbe49f3fd-ceph\") pod \"3de99810-1335-466f-8386-e4ecbe49f3fd\" (UID: \"3de99810-1335-466f-8386-e4ecbe49f3fd\") " Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.140261 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3de99810-1335-466f-8386-e4ecbe49f3fd-kube-api-access-9scdb" (OuterVolumeSpecName: "kube-api-access-9scdb") pod "3de99810-1335-466f-8386-e4ecbe49f3fd" (UID: "3de99810-1335-466f-8386-e4ecbe49f3fd"). InnerVolumeSpecName "kube-api-access-9scdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.142138 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3de99810-1335-466f-8386-e4ecbe49f3fd-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "3de99810-1335-466f-8386-e4ecbe49f3fd" (UID: "3de99810-1335-466f-8386-e4ecbe49f3fd"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.142126 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3de99810-1335-466f-8386-e4ecbe49f3fd-ceph" (OuterVolumeSpecName: "ceph") pod "3de99810-1335-466f-8386-e4ecbe49f3fd" (UID: "3de99810-1335-466f-8386-e4ecbe49f3fd"). 
InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.155618 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3de99810-1335-466f-8386-e4ecbe49f3fd-inventory" (OuterVolumeSpecName: "inventory") pod "3de99810-1335-466f-8386-e4ecbe49f3fd" (UID: "3de99810-1335-466f-8386-e4ecbe49f3fd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.156546 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3de99810-1335-466f-8386-e4ecbe49f3fd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3de99810-1335-466f-8386-e4ecbe49f3fd" (UID: "3de99810-1335-466f-8386-e4ecbe49f3fd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.237568 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9scdb\" (UniqueName: \"kubernetes.io/projected/3de99810-1335-466f-8386-e4ecbe49f3fd-kube-api-access-9scdb\") on node \"crc\" DevicePath \"\"" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.237688 4565 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3de99810-1335-466f-8386-e4ecbe49f3fd-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.237764 4565 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3de99810-1335-466f-8386-e4ecbe49f3fd-ceph\") on node \"crc\" DevicePath \"\"" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.237824 4565 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de99810-1335-466f-8386-e4ecbe49f3fd-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 
09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.237876 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3de99810-1335-466f-8386-e4ecbe49f3fd-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.595638 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f" event={"ID":"3de99810-1335-466f-8386-e4ecbe49f3fd","Type":"ContainerDied","Data":"03e98523b1a7fc219df1558fba05a3bbec08c0593f55c2c21e929449e10928e2"} Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.595690 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.595695 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03e98523b1a7fc219df1558fba05a3bbec08c0593f55c2c21e929449e10928e2" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.668865 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lggx8"] Nov 25 09:37:13 crc kubenswrapper[4565]: E1125 09:37:13.669342 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3de99810-1335-466f-8386-e4ecbe49f3fd" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.669362 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de99810-1335-466f-8386-e4ecbe49f3fd" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.669575 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="3de99810-1335-466f-8386-e4ecbe49f3fd" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.670294 4565 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lggx8" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.677212 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.677648 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.677978 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.678185 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.681542 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lggx8"] Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.685037 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.847964 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwhht\" (UniqueName: \"kubernetes.io/projected/13f5faf5-45eb-46fc-b76b-59b8babba10c-kube-api-access-jwhht\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lggx8\" (UID: \"13f5faf5-45eb-46fc-b76b-59b8babba10c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lggx8" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.848705 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f5faf5-45eb-46fc-b76b-59b8babba10c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lggx8\" 
(UID: \"13f5faf5-45eb-46fc-b76b-59b8babba10c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lggx8" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.849158 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13f5faf5-45eb-46fc-b76b-59b8babba10c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lggx8\" (UID: \"13f5faf5-45eb-46fc-b76b-59b8babba10c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lggx8" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.849325 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13f5faf5-45eb-46fc-b76b-59b8babba10c-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lggx8\" (UID: \"13f5faf5-45eb-46fc-b76b-59b8babba10c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lggx8" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.951020 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwhht\" (UniqueName: \"kubernetes.io/projected/13f5faf5-45eb-46fc-b76b-59b8babba10c-kube-api-access-jwhht\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lggx8\" (UID: \"13f5faf5-45eb-46fc-b76b-59b8babba10c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lggx8" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.951076 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f5faf5-45eb-46fc-b76b-59b8babba10c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lggx8\" (UID: \"13f5faf5-45eb-46fc-b76b-59b8babba10c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lggx8" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.951153 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13f5faf5-45eb-46fc-b76b-59b8babba10c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lggx8\" (UID: \"13f5faf5-45eb-46fc-b76b-59b8babba10c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lggx8" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.951178 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13f5faf5-45eb-46fc-b76b-59b8babba10c-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lggx8\" (UID: \"13f5faf5-45eb-46fc-b76b-59b8babba10c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lggx8" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.954724 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13f5faf5-45eb-46fc-b76b-59b8babba10c-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lggx8\" (UID: \"13f5faf5-45eb-46fc-b76b-59b8babba10c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lggx8" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.964699 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f5faf5-45eb-46fc-b76b-59b8babba10c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lggx8\" (UID: \"13f5faf5-45eb-46fc-b76b-59b8babba10c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lggx8" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.964963 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13f5faf5-45eb-46fc-b76b-59b8babba10c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lggx8\" (UID: \"13f5faf5-45eb-46fc-b76b-59b8babba10c\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lggx8" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.967099 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwhht\" (UniqueName: \"kubernetes.io/projected/13f5faf5-45eb-46fc-b76b-59b8babba10c-kube-api-access-jwhht\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lggx8\" (UID: \"13f5faf5-45eb-46fc-b76b-59b8babba10c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lggx8" Nov 25 09:37:13 crc kubenswrapper[4565]: I1125 09:37:13.998502 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lggx8" Nov 25 09:37:14 crc kubenswrapper[4565]: I1125 09:37:14.457438 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lggx8"] Nov 25 09:37:14 crc kubenswrapper[4565]: I1125 09:37:14.607786 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lggx8" event={"ID":"13f5faf5-45eb-46fc-b76b-59b8babba10c","Type":"ContainerStarted","Data":"ac9a2f97b0b009e4e2cacad0bf00234067a94215ec37200f18a526b3b79d585d"} Nov 25 09:37:15 crc kubenswrapper[4565]: I1125 09:37:15.620105 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lggx8" event={"ID":"13f5faf5-45eb-46fc-b76b-59b8babba10c","Type":"ContainerStarted","Data":"60de986243ed13e609af6821427e472909a00e40d7650194fdfd340f3f52c192"} Nov 25 09:37:15 crc kubenswrapper[4565]: I1125 09:37:15.639429 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lggx8" podStartSLOduration=2.156754736 podStartE2EDuration="2.639405592s" podCreationTimestamp="2025-11-25 09:37:13 +0000 UTC" firstStartedPulling="2025-11-25 
09:37:14.460909681 +0000 UTC m=+1967.663404810" lastFinishedPulling="2025-11-25 09:37:14.943560528 +0000 UTC m=+1968.146055666" observedRunningTime="2025-11-25 09:37:15.632011262 +0000 UTC m=+1968.834506400" watchObservedRunningTime="2025-11-25 09:37:15.639405592 +0000 UTC m=+1968.841900730" Nov 25 09:37:25 crc kubenswrapper[4565]: I1125 09:37:25.099287 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:37:25 crc kubenswrapper[4565]: I1125 09:37:25.100526 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:37:35 crc kubenswrapper[4565]: I1125 09:37:35.813704 4565 generic.go:334] "Generic (PLEG): container finished" podID="13f5faf5-45eb-46fc-b76b-59b8babba10c" containerID="60de986243ed13e609af6821427e472909a00e40d7650194fdfd340f3f52c192" exitCode=0 Nov 25 09:37:35 crc kubenswrapper[4565]: I1125 09:37:35.813941 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lggx8" event={"ID":"13f5faf5-45eb-46fc-b76b-59b8babba10c","Type":"ContainerDied","Data":"60de986243ed13e609af6821427e472909a00e40d7650194fdfd340f3f52c192"} Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.171083 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lggx8" Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.313657 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13f5faf5-45eb-46fc-b76b-59b8babba10c-ssh-key\") pod \"13f5faf5-45eb-46fc-b76b-59b8babba10c\" (UID: \"13f5faf5-45eb-46fc-b76b-59b8babba10c\") " Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.313894 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f5faf5-45eb-46fc-b76b-59b8babba10c-inventory\") pod \"13f5faf5-45eb-46fc-b76b-59b8babba10c\" (UID: \"13f5faf5-45eb-46fc-b76b-59b8babba10c\") " Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.314202 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13f5faf5-45eb-46fc-b76b-59b8babba10c-ceph\") pod \"13f5faf5-45eb-46fc-b76b-59b8babba10c\" (UID: \"13f5faf5-45eb-46fc-b76b-59b8babba10c\") " Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.314252 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwhht\" (UniqueName: \"kubernetes.io/projected/13f5faf5-45eb-46fc-b76b-59b8babba10c-kube-api-access-jwhht\") pod \"13f5faf5-45eb-46fc-b76b-59b8babba10c\" (UID: \"13f5faf5-45eb-46fc-b76b-59b8babba10c\") " Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.319974 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13f5faf5-45eb-46fc-b76b-59b8babba10c-kube-api-access-jwhht" (OuterVolumeSpecName: "kube-api-access-jwhht") pod "13f5faf5-45eb-46fc-b76b-59b8babba10c" (UID: "13f5faf5-45eb-46fc-b76b-59b8babba10c"). InnerVolumeSpecName "kube-api-access-jwhht". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.320747 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f5faf5-45eb-46fc-b76b-59b8babba10c-ceph" (OuterVolumeSpecName: "ceph") pod "13f5faf5-45eb-46fc-b76b-59b8babba10c" (UID: "13f5faf5-45eb-46fc-b76b-59b8babba10c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.338279 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f5faf5-45eb-46fc-b76b-59b8babba10c-inventory" (OuterVolumeSpecName: "inventory") pod "13f5faf5-45eb-46fc-b76b-59b8babba10c" (UID: "13f5faf5-45eb-46fc-b76b-59b8babba10c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.340397 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f5faf5-45eb-46fc-b76b-59b8babba10c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "13f5faf5-45eb-46fc-b76b-59b8babba10c" (UID: "13f5faf5-45eb-46fc-b76b-59b8babba10c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.418162 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13f5faf5-45eb-46fc-b76b-59b8babba10c-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.418232 4565 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f5faf5-45eb-46fc-b76b-59b8babba10c-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.418291 4565 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13f5faf5-45eb-46fc-b76b-59b8babba10c-ceph\") on node \"crc\" DevicePath \"\"" Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.418347 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwhht\" (UniqueName: \"kubernetes.io/projected/13f5faf5-45eb-46fc-b76b-59b8babba10c-kube-api-access-jwhht\") on node \"crc\" DevicePath \"\"" Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.835023 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lggx8" event={"ID":"13f5faf5-45eb-46fc-b76b-59b8babba10c","Type":"ContainerDied","Data":"ac9a2f97b0b009e4e2cacad0bf00234067a94215ec37200f18a526b3b79d585d"} Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.835291 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lggx8" Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.835403 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac9a2f97b0b009e4e2cacad0bf00234067a94215ec37200f18a526b3b79d585d" Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.906581 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9"] Nov 25 09:37:37 crc kubenswrapper[4565]: E1125 09:37:37.907088 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13f5faf5-45eb-46fc-b76b-59b8babba10c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.907112 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="13f5faf5-45eb-46fc-b76b-59b8babba10c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.907339 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="13f5faf5-45eb-46fc-b76b-59b8babba10c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.908045 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9" Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.911461 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.911473 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.911466 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.911692 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.915502 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc" Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.923849 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9"] Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.930375 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b9ebd21-0421-42f3-a7e6-8f0038b8c07e-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9\" (UID: \"7b9ebd21-0421-42f3-a7e6-8f0038b8c07e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9" Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.930462 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7b9ebd21-0421-42f3-a7e6-8f0038b8c07e-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9\" (UID: \"7b9ebd21-0421-42f3-a7e6-8f0038b8c07e\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9" Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.930562 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fww8x\" (UniqueName: \"kubernetes.io/projected/7b9ebd21-0421-42f3-a7e6-8f0038b8c07e-kube-api-access-fww8x\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9\" (UID: \"7b9ebd21-0421-42f3-a7e6-8f0038b8c07e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9" Nov 25 09:37:37 crc kubenswrapper[4565]: I1125 09:37:37.930747 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b9ebd21-0421-42f3-a7e6-8f0038b8c07e-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9\" (UID: \"7b9ebd21-0421-42f3-a7e6-8f0038b8c07e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9" Nov 25 09:37:38 crc kubenswrapper[4565]: I1125 09:37:38.032606 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b9ebd21-0421-42f3-a7e6-8f0038b8c07e-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9\" (UID: \"7b9ebd21-0421-42f3-a7e6-8f0038b8c07e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9" Nov 25 09:37:38 crc kubenswrapper[4565]: I1125 09:37:38.032818 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b9ebd21-0421-42f3-a7e6-8f0038b8c07e-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9\" (UID: \"7b9ebd21-0421-42f3-a7e6-8f0038b8c07e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9" Nov 25 09:37:38 crc kubenswrapper[4565]: I1125 09:37:38.032951 4565 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7b9ebd21-0421-42f3-a7e6-8f0038b8c07e-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9\" (UID: \"7b9ebd21-0421-42f3-a7e6-8f0038b8c07e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9" Nov 25 09:37:38 crc kubenswrapper[4565]: I1125 09:37:38.033066 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fww8x\" (UniqueName: \"kubernetes.io/projected/7b9ebd21-0421-42f3-a7e6-8f0038b8c07e-kube-api-access-fww8x\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9\" (UID: \"7b9ebd21-0421-42f3-a7e6-8f0038b8c07e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9" Nov 25 09:37:38 crc kubenswrapper[4565]: I1125 09:37:38.037490 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b9ebd21-0421-42f3-a7e6-8f0038b8c07e-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9\" (UID: \"7b9ebd21-0421-42f3-a7e6-8f0038b8c07e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9" Nov 25 09:37:38 crc kubenswrapper[4565]: I1125 09:37:38.041871 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7b9ebd21-0421-42f3-a7e6-8f0038b8c07e-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9\" (UID: \"7b9ebd21-0421-42f3-a7e6-8f0038b8c07e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9" Nov 25 09:37:38 crc kubenswrapper[4565]: I1125 09:37:38.049549 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b9ebd21-0421-42f3-a7e6-8f0038b8c07e-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9\" (UID: \"7b9ebd21-0421-42f3-a7e6-8f0038b8c07e\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9" Nov 25 09:37:38 crc kubenswrapper[4565]: I1125 09:37:38.051583 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fww8x\" (UniqueName: \"kubernetes.io/projected/7b9ebd21-0421-42f3-a7e6-8f0038b8c07e-kube-api-access-fww8x\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9\" (UID: \"7b9ebd21-0421-42f3-a7e6-8f0038b8c07e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9" Nov 25 09:37:38 crc kubenswrapper[4565]: I1125 09:37:38.221364 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9" Nov 25 09:37:38 crc kubenswrapper[4565]: I1125 09:37:38.722436 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9"] Nov 25 09:37:38 crc kubenswrapper[4565]: I1125 09:37:38.845243 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9" event={"ID":"7b9ebd21-0421-42f3-a7e6-8f0038b8c07e","Type":"ContainerStarted","Data":"738c40e5cb89c5367698d6c8d39ceec6d1031baecf5df2f18f9432b4decc9838"} Nov 25 09:37:39 crc kubenswrapper[4565]: I1125 09:37:39.856217 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9" event={"ID":"7b9ebd21-0421-42f3-a7e6-8f0038b8c07e","Type":"ContainerStarted","Data":"487f9ea199569eb65a91ea5753840e4aebeb6498f98fc5d4e6d6602be40079c5"} Nov 25 09:37:39 crc kubenswrapper[4565]: I1125 09:37:39.881149 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9" podStartSLOduration=2.263360047 podStartE2EDuration="2.881131771s" podCreationTimestamp="2025-11-25 09:37:37 +0000 UTC" firstStartedPulling="2025-11-25 
09:37:38.729966786 +0000 UTC m=+1991.932461925" lastFinishedPulling="2025-11-25 09:37:39.347738512 +0000 UTC m=+1992.550233649" observedRunningTime="2025-11-25 09:37:39.871414151 +0000 UTC m=+1993.073909289" watchObservedRunningTime="2025-11-25 09:37:39.881131771 +0000 UTC m=+1993.083626909" Nov 25 09:37:43 crc kubenswrapper[4565]: I1125 09:37:43.893335 4565 generic.go:334] "Generic (PLEG): container finished" podID="7b9ebd21-0421-42f3-a7e6-8f0038b8c07e" containerID="487f9ea199569eb65a91ea5753840e4aebeb6498f98fc5d4e6d6602be40079c5" exitCode=0 Nov 25 09:37:43 crc kubenswrapper[4565]: I1125 09:37:43.893430 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9" event={"ID":"7b9ebd21-0421-42f3-a7e6-8f0038b8c07e","Type":"ContainerDied","Data":"487f9ea199569eb65a91ea5753840e4aebeb6498f98fc5d4e6d6602be40079c5"} Nov 25 09:37:45 crc kubenswrapper[4565]: I1125 09:37:45.302838 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9" Nov 25 09:37:45 crc kubenswrapper[4565]: I1125 09:37:45.485189 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7b9ebd21-0421-42f3-a7e6-8f0038b8c07e-ceph\") pod \"7b9ebd21-0421-42f3-a7e6-8f0038b8c07e\" (UID: \"7b9ebd21-0421-42f3-a7e6-8f0038b8c07e\") " Nov 25 09:37:45 crc kubenswrapper[4565]: I1125 09:37:45.485234 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b9ebd21-0421-42f3-a7e6-8f0038b8c07e-inventory\") pod \"7b9ebd21-0421-42f3-a7e6-8f0038b8c07e\" (UID: \"7b9ebd21-0421-42f3-a7e6-8f0038b8c07e\") " Nov 25 09:37:45 crc kubenswrapper[4565]: I1125 09:37:45.485254 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b9ebd21-0421-42f3-a7e6-8f0038b8c07e-ssh-key\") pod \"7b9ebd21-0421-42f3-a7e6-8f0038b8c07e\" (UID: \"7b9ebd21-0421-42f3-a7e6-8f0038b8c07e\") " Nov 25 09:37:45 crc kubenswrapper[4565]: I1125 09:37:45.485275 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fww8x\" (UniqueName: \"kubernetes.io/projected/7b9ebd21-0421-42f3-a7e6-8f0038b8c07e-kube-api-access-fww8x\") pod \"7b9ebd21-0421-42f3-a7e6-8f0038b8c07e\" (UID: \"7b9ebd21-0421-42f3-a7e6-8f0038b8c07e\") " Nov 25 09:37:45 crc kubenswrapper[4565]: I1125 09:37:45.491514 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b9ebd21-0421-42f3-a7e6-8f0038b8c07e-kube-api-access-fww8x" (OuterVolumeSpecName: "kube-api-access-fww8x") pod "7b9ebd21-0421-42f3-a7e6-8f0038b8c07e" (UID: "7b9ebd21-0421-42f3-a7e6-8f0038b8c07e"). InnerVolumeSpecName "kube-api-access-fww8x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:37:45 crc kubenswrapper[4565]: I1125 09:37:45.491693 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b9ebd21-0421-42f3-a7e6-8f0038b8c07e-ceph" (OuterVolumeSpecName: "ceph") pod "7b9ebd21-0421-42f3-a7e6-8f0038b8c07e" (UID: "7b9ebd21-0421-42f3-a7e6-8f0038b8c07e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:37:45 crc kubenswrapper[4565]: I1125 09:37:45.511090 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b9ebd21-0421-42f3-a7e6-8f0038b8c07e-inventory" (OuterVolumeSpecName: "inventory") pod "7b9ebd21-0421-42f3-a7e6-8f0038b8c07e" (UID: "7b9ebd21-0421-42f3-a7e6-8f0038b8c07e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:37:45 crc kubenswrapper[4565]: I1125 09:37:45.512402 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b9ebd21-0421-42f3-a7e6-8f0038b8c07e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7b9ebd21-0421-42f3-a7e6-8f0038b8c07e" (UID: "7b9ebd21-0421-42f3-a7e6-8f0038b8c07e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:37:45 crc kubenswrapper[4565]: I1125 09:37:45.588114 4565 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7b9ebd21-0421-42f3-a7e6-8f0038b8c07e-ceph\") on node \"crc\" DevicePath \"\"" Nov 25 09:37:45 crc kubenswrapper[4565]: I1125 09:37:45.588154 4565 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b9ebd21-0421-42f3-a7e6-8f0038b8c07e-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 09:37:45 crc kubenswrapper[4565]: I1125 09:37:45.588169 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b9ebd21-0421-42f3-a7e6-8f0038b8c07e-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 09:37:45 crc kubenswrapper[4565]: I1125 09:37:45.588180 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fww8x\" (UniqueName: \"kubernetes.io/projected/7b9ebd21-0421-42f3-a7e6-8f0038b8c07e-kube-api-access-fww8x\") on node \"crc\" DevicePath \"\"" Nov 25 09:37:45 crc kubenswrapper[4565]: I1125 09:37:45.915189 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9" event={"ID":"7b9ebd21-0421-42f3-a7e6-8f0038b8c07e","Type":"ContainerDied","Data":"738c40e5cb89c5367698d6c8d39ceec6d1031baecf5df2f18f9432b4decc9838"} Nov 25 09:37:45 crc kubenswrapper[4565]: I1125 09:37:45.915240 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="738c40e5cb89c5367698d6c8d39ceec6d1031baecf5df2f18f9432b4decc9838" Nov 25 09:37:45 crc kubenswrapper[4565]: I1125 09:37:45.915304 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9" Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.033740 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-h9t4j"] Nov 25 09:37:46 crc kubenswrapper[4565]: E1125 09:37:46.034160 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9ebd21-0421-42f3-a7e6-8f0038b8c07e" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.034181 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9ebd21-0421-42f3-a7e6-8f0038b8c07e" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.034365 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b9ebd21-0421-42f3-a7e6-8f0038b8c07e" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.034990 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h9t4j" Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.037610 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.037631 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.038029 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.038950 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc" Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.039042 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.054804 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-h9t4j"] Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.201407 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/766382aa-dcdb-41f0-afb0-cb90d5ea8f31-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h9t4j\" (UID: \"766382aa-dcdb-41f0-afb0-cb90d5ea8f31\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h9t4j" Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.201460 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/766382aa-dcdb-41f0-afb0-cb90d5ea8f31-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h9t4j\" (UID: \"766382aa-dcdb-41f0-afb0-cb90d5ea8f31\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h9t4j" Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.201504 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnmqm\" (UniqueName: \"kubernetes.io/projected/766382aa-dcdb-41f0-afb0-cb90d5ea8f31-kube-api-access-dnmqm\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h9t4j\" (UID: \"766382aa-dcdb-41f0-afb0-cb90d5ea8f31\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h9t4j" Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.201534 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/766382aa-dcdb-41f0-afb0-cb90d5ea8f31-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h9t4j\" (UID: \"766382aa-dcdb-41f0-afb0-cb90d5ea8f31\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h9t4j" Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.303883 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/766382aa-dcdb-41f0-afb0-cb90d5ea8f31-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h9t4j\" (UID: \"766382aa-dcdb-41f0-afb0-cb90d5ea8f31\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h9t4j" Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.304220 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/766382aa-dcdb-41f0-afb0-cb90d5ea8f31-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h9t4j\" (UID: \"766382aa-dcdb-41f0-afb0-cb90d5ea8f31\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h9t4j" Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.304303 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnmqm\" (UniqueName: 
\"kubernetes.io/projected/766382aa-dcdb-41f0-afb0-cb90d5ea8f31-kube-api-access-dnmqm\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h9t4j\" (UID: \"766382aa-dcdb-41f0-afb0-cb90d5ea8f31\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h9t4j" Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.304358 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/766382aa-dcdb-41f0-afb0-cb90d5ea8f31-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h9t4j\" (UID: \"766382aa-dcdb-41f0-afb0-cb90d5ea8f31\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h9t4j" Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.308065 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/766382aa-dcdb-41f0-afb0-cb90d5ea8f31-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h9t4j\" (UID: \"766382aa-dcdb-41f0-afb0-cb90d5ea8f31\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h9t4j" Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.312358 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/766382aa-dcdb-41f0-afb0-cb90d5ea8f31-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h9t4j\" (UID: \"766382aa-dcdb-41f0-afb0-cb90d5ea8f31\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h9t4j" Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.314395 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/766382aa-dcdb-41f0-afb0-cb90d5ea8f31-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h9t4j\" (UID: \"766382aa-dcdb-41f0-afb0-cb90d5ea8f31\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h9t4j" Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.319408 4565 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnmqm\" (UniqueName: \"kubernetes.io/projected/766382aa-dcdb-41f0-afb0-cb90d5ea8f31-kube-api-access-dnmqm\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h9t4j\" (UID: \"766382aa-dcdb-41f0-afb0-cb90d5ea8f31\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h9t4j" Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.354510 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h9t4j" Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.826764 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-h9t4j"] Nov 25 09:37:46 crc kubenswrapper[4565]: I1125 09:37:46.926041 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h9t4j" event={"ID":"766382aa-dcdb-41f0-afb0-cb90d5ea8f31","Type":"ContainerStarted","Data":"4cdd15803df150ebc264ccecd999de3db508db0508714b8b1806e22a5694734c"} Nov 25 09:37:47 crc kubenswrapper[4565]: I1125 09:37:47.938866 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h9t4j" event={"ID":"766382aa-dcdb-41f0-afb0-cb90d5ea8f31","Type":"ContainerStarted","Data":"fcd0f13c742681721683ac5e5a35b97f146f65d0396811f57359a65ee66f667b"} Nov 25 09:37:47 crc kubenswrapper[4565]: I1125 09:37:47.963960 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h9t4j" podStartSLOduration=1.49648385 podStartE2EDuration="1.963916126s" podCreationTimestamp="2025-11-25 09:37:46 +0000 UTC" firstStartedPulling="2025-11-25 09:37:46.833581281 +0000 UTC m=+2000.036076419" lastFinishedPulling="2025-11-25 09:37:47.301013557 +0000 UTC m=+2000.503508695" observedRunningTime="2025-11-25 09:37:47.95217444 +0000 UTC 
m=+2001.154669578" watchObservedRunningTime="2025-11-25 09:37:47.963916126 +0000 UTC m=+2001.166411265" Nov 25 09:37:55 crc kubenswrapper[4565]: I1125 09:37:55.099121 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:37:55 crc kubenswrapper[4565]: I1125 09:37:55.100009 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:37:55 crc kubenswrapper[4565]: I1125 09:37:55.107474 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 09:37:55 crc kubenswrapper[4565]: I1125 09:37:55.108466 4565 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"538bdafdafa57c950bb08b4eaa710350c74e791fe340685c16a050dcd97f3d53"} pod="openshift-machine-config-operator/machine-config-daemon-r28bt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 09:37:55 crc kubenswrapper[4565]: I1125 09:37:55.108544 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" containerID="cri-o://538bdafdafa57c950bb08b4eaa710350c74e791fe340685c16a050dcd97f3d53" gracePeriod=600 Nov 25 09:37:56 crc kubenswrapper[4565]: I1125 09:37:56.015503 4565 generic.go:334] "Generic (PLEG): container 
finished" podID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerID="538bdafdafa57c950bb08b4eaa710350c74e791fe340685c16a050dcd97f3d53" exitCode=0 Nov 25 09:37:56 crc kubenswrapper[4565]: I1125 09:37:56.015575 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerDied","Data":"538bdafdafa57c950bb08b4eaa710350c74e791fe340685c16a050dcd97f3d53"} Nov 25 09:37:56 crc kubenswrapper[4565]: I1125 09:37:56.016290 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerStarted","Data":"59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843"} Nov 25 09:37:56 crc kubenswrapper[4565]: I1125 09:37:56.016336 4565 scope.go:117] "RemoveContainer" containerID="892eccd43c68bfe273b6084d6415256b56d74cf02c163c71dedd8daed8150b3d" Nov 25 09:38:18 crc kubenswrapper[4565]: I1125 09:38:18.229664 4565 generic.go:334] "Generic (PLEG): container finished" podID="766382aa-dcdb-41f0-afb0-cb90d5ea8f31" containerID="fcd0f13c742681721683ac5e5a35b97f146f65d0396811f57359a65ee66f667b" exitCode=0 Nov 25 09:38:18 crc kubenswrapper[4565]: I1125 09:38:18.229755 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h9t4j" event={"ID":"766382aa-dcdb-41f0-afb0-cb90d5ea8f31","Type":"ContainerDied","Data":"fcd0f13c742681721683ac5e5a35b97f146f65d0396811f57359a65ee66f667b"} Nov 25 09:38:19 crc kubenswrapper[4565]: I1125 09:38:19.710353 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h9t4j" Nov 25 09:38:19 crc kubenswrapper[4565]: I1125 09:38:19.815830 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/766382aa-dcdb-41f0-afb0-cb90d5ea8f31-ssh-key\") pod \"766382aa-dcdb-41f0-afb0-cb90d5ea8f31\" (UID: \"766382aa-dcdb-41f0-afb0-cb90d5ea8f31\") " Nov 25 09:38:19 crc kubenswrapper[4565]: I1125 09:38:19.816035 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/766382aa-dcdb-41f0-afb0-cb90d5ea8f31-ceph\") pod \"766382aa-dcdb-41f0-afb0-cb90d5ea8f31\" (UID: \"766382aa-dcdb-41f0-afb0-cb90d5ea8f31\") " Nov 25 09:38:19 crc kubenswrapper[4565]: I1125 09:38:19.816109 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnmqm\" (UniqueName: \"kubernetes.io/projected/766382aa-dcdb-41f0-afb0-cb90d5ea8f31-kube-api-access-dnmqm\") pod \"766382aa-dcdb-41f0-afb0-cb90d5ea8f31\" (UID: \"766382aa-dcdb-41f0-afb0-cb90d5ea8f31\") " Nov 25 09:38:19 crc kubenswrapper[4565]: I1125 09:38:19.816203 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/766382aa-dcdb-41f0-afb0-cb90d5ea8f31-inventory\") pod \"766382aa-dcdb-41f0-afb0-cb90d5ea8f31\" (UID: \"766382aa-dcdb-41f0-afb0-cb90d5ea8f31\") " Nov 25 09:38:19 crc kubenswrapper[4565]: I1125 09:38:19.822078 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/766382aa-dcdb-41f0-afb0-cb90d5ea8f31-ceph" (OuterVolumeSpecName: "ceph") pod "766382aa-dcdb-41f0-afb0-cb90d5ea8f31" (UID: "766382aa-dcdb-41f0-afb0-cb90d5ea8f31"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:38:19 crc kubenswrapper[4565]: I1125 09:38:19.826494 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/766382aa-dcdb-41f0-afb0-cb90d5ea8f31-kube-api-access-dnmqm" (OuterVolumeSpecName: "kube-api-access-dnmqm") pod "766382aa-dcdb-41f0-afb0-cb90d5ea8f31" (UID: "766382aa-dcdb-41f0-afb0-cb90d5ea8f31"). InnerVolumeSpecName "kube-api-access-dnmqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:38:19 crc kubenswrapper[4565]: I1125 09:38:19.842086 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/766382aa-dcdb-41f0-afb0-cb90d5ea8f31-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "766382aa-dcdb-41f0-afb0-cb90d5ea8f31" (UID: "766382aa-dcdb-41f0-afb0-cb90d5ea8f31"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:38:19 crc kubenswrapper[4565]: I1125 09:38:19.847206 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/766382aa-dcdb-41f0-afb0-cb90d5ea8f31-inventory" (OuterVolumeSpecName: "inventory") pod "766382aa-dcdb-41f0-afb0-cb90d5ea8f31" (UID: "766382aa-dcdb-41f0-afb0-cb90d5ea8f31"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:38:19 crc kubenswrapper[4565]: I1125 09:38:19.919342 4565 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/766382aa-dcdb-41f0-afb0-cb90d5ea8f31-ceph\") on node \"crc\" DevicePath \"\"" Nov 25 09:38:19 crc kubenswrapper[4565]: I1125 09:38:19.919380 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnmqm\" (UniqueName: \"kubernetes.io/projected/766382aa-dcdb-41f0-afb0-cb90d5ea8f31-kube-api-access-dnmqm\") on node \"crc\" DevicePath \"\"" Nov 25 09:38:19 crc kubenswrapper[4565]: I1125 09:38:19.919396 4565 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/766382aa-dcdb-41f0-afb0-cb90d5ea8f31-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 09:38:19 crc kubenswrapper[4565]: I1125 09:38:19.919405 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/766382aa-dcdb-41f0-afb0-cb90d5ea8f31-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.250656 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h9t4j" event={"ID":"766382aa-dcdb-41f0-afb0-cb90d5ea8f31","Type":"ContainerDied","Data":"4cdd15803df150ebc264ccecd999de3db508db0508714b8b1806e22a5694734c"} Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.250706 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cdd15803df150ebc264ccecd999de3db508db0508714b8b1806e22a5694734c" Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.251147 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h9t4j" Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.347050 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v"] Nov 25 09:38:20 crc kubenswrapper[4565]: E1125 09:38:20.348005 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="766382aa-dcdb-41f0-afb0-cb90d5ea8f31" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.348032 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="766382aa-dcdb-41f0-afb0-cb90d5ea8f31" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.348296 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="766382aa-dcdb-41f0-afb0-cb90d5ea8f31" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.349170 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v" Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.356910 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc" Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.356988 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.356987 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.357255 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.357528 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.373676 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v"] Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.429971 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6fcb5ee3-1258-4245-bd22-5aecd14a312c-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v\" (UID: \"6fcb5ee3-1258-4245-bd22-5aecd14a312c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v" Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.430145 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6fcb5ee3-1258-4245-bd22-5aecd14a312c-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v\" (UID: \"6fcb5ee3-1258-4245-bd22-5aecd14a312c\") " 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v" Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.430274 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm4jb\" (UniqueName: \"kubernetes.io/projected/6fcb5ee3-1258-4245-bd22-5aecd14a312c-kube-api-access-wm4jb\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v\" (UID: \"6fcb5ee3-1258-4245-bd22-5aecd14a312c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v" Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.430651 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fcb5ee3-1258-4245-bd22-5aecd14a312c-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v\" (UID: \"6fcb5ee3-1258-4245-bd22-5aecd14a312c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v" Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.531870 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fcb5ee3-1258-4245-bd22-5aecd14a312c-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v\" (UID: \"6fcb5ee3-1258-4245-bd22-5aecd14a312c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v" Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.532084 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6fcb5ee3-1258-4245-bd22-5aecd14a312c-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v\" (UID: \"6fcb5ee3-1258-4245-bd22-5aecd14a312c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v" Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.532341 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/6fcb5ee3-1258-4245-bd22-5aecd14a312c-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v\" (UID: \"6fcb5ee3-1258-4245-bd22-5aecd14a312c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v" Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.532560 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm4jb\" (UniqueName: \"kubernetes.io/projected/6fcb5ee3-1258-4245-bd22-5aecd14a312c-kube-api-access-wm4jb\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v\" (UID: \"6fcb5ee3-1258-4245-bd22-5aecd14a312c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v" Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.536864 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6fcb5ee3-1258-4245-bd22-5aecd14a312c-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v\" (UID: \"6fcb5ee3-1258-4245-bd22-5aecd14a312c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v" Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.541672 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fcb5ee3-1258-4245-bd22-5aecd14a312c-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v\" (UID: \"6fcb5ee3-1258-4245-bd22-5aecd14a312c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v" Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.542185 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6fcb5ee3-1258-4245-bd22-5aecd14a312c-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v\" (UID: \"6fcb5ee3-1258-4245-bd22-5aecd14a312c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v" Nov 25 09:38:20 crc 
kubenswrapper[4565]: I1125 09:38:20.553338 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm4jb\" (UniqueName: \"kubernetes.io/projected/6fcb5ee3-1258-4245-bd22-5aecd14a312c-kube-api-access-wm4jb\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v\" (UID: \"6fcb5ee3-1258-4245-bd22-5aecd14a312c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v" Nov 25 09:38:20 crc kubenswrapper[4565]: I1125 09:38:20.675472 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v" Nov 25 09:38:21 crc kubenswrapper[4565]: I1125 09:38:21.191872 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v"] Nov 25 09:38:21 crc kubenswrapper[4565]: I1125 09:38:21.258789 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v" event={"ID":"6fcb5ee3-1258-4245-bd22-5aecd14a312c","Type":"ContainerStarted","Data":"64b9a87e4afdcd31690b9b293a7044117de2fe8f3f26c9b1ee41db7e3c6b5540"} Nov 25 09:38:22 crc kubenswrapper[4565]: I1125 09:38:22.273044 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v" event={"ID":"6fcb5ee3-1258-4245-bd22-5aecd14a312c","Type":"ContainerStarted","Data":"7a9530dc4850e915373067280c39a952f1b255382123ed5b34474702d9d93c8c"} Nov 25 09:38:22 crc kubenswrapper[4565]: I1125 09:38:22.298158 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v" podStartSLOduration=1.638498529 podStartE2EDuration="2.298131223s" podCreationTimestamp="2025-11-25 09:38:20 +0000 UTC" firstStartedPulling="2025-11-25 09:38:21.200208916 +0000 UTC m=+2034.402704054" lastFinishedPulling="2025-11-25 09:38:21.859841609 +0000 UTC m=+2035.062336748" 
observedRunningTime="2025-11-25 09:38:22.29095271 +0000 UTC m=+2035.493447848" watchObservedRunningTime="2025-11-25 09:38:22.298131223 +0000 UTC m=+2035.500626351" Nov 25 09:38:25 crc kubenswrapper[4565]: I1125 09:38:25.305331 4565 generic.go:334] "Generic (PLEG): container finished" podID="6fcb5ee3-1258-4245-bd22-5aecd14a312c" containerID="7a9530dc4850e915373067280c39a952f1b255382123ed5b34474702d9d93c8c" exitCode=0 Nov 25 09:38:25 crc kubenswrapper[4565]: I1125 09:38:25.305428 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v" event={"ID":"6fcb5ee3-1258-4245-bd22-5aecd14a312c","Type":"ContainerDied","Data":"7a9530dc4850e915373067280c39a952f1b255382123ed5b34474702d9d93c8c"} Nov 25 09:38:26 crc kubenswrapper[4565]: I1125 09:38:26.638330 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v" Nov 25 09:38:26 crc kubenswrapper[4565]: I1125 09:38:26.652349 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm4jb\" (UniqueName: \"kubernetes.io/projected/6fcb5ee3-1258-4245-bd22-5aecd14a312c-kube-api-access-wm4jb\") pod \"6fcb5ee3-1258-4245-bd22-5aecd14a312c\" (UID: \"6fcb5ee3-1258-4245-bd22-5aecd14a312c\") " Nov 25 09:38:26 crc kubenswrapper[4565]: I1125 09:38:26.652461 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fcb5ee3-1258-4245-bd22-5aecd14a312c-inventory\") pod \"6fcb5ee3-1258-4245-bd22-5aecd14a312c\" (UID: \"6fcb5ee3-1258-4245-bd22-5aecd14a312c\") " Nov 25 09:38:26 crc kubenswrapper[4565]: I1125 09:38:26.652555 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6fcb5ee3-1258-4245-bd22-5aecd14a312c-ssh-key\") pod \"6fcb5ee3-1258-4245-bd22-5aecd14a312c\" (UID: 
\"6fcb5ee3-1258-4245-bd22-5aecd14a312c\") " Nov 25 09:38:26 crc kubenswrapper[4565]: I1125 09:38:26.652820 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6fcb5ee3-1258-4245-bd22-5aecd14a312c-ceph\") pod \"6fcb5ee3-1258-4245-bd22-5aecd14a312c\" (UID: \"6fcb5ee3-1258-4245-bd22-5aecd14a312c\") " Nov 25 09:38:26 crc kubenswrapper[4565]: I1125 09:38:26.659694 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fcb5ee3-1258-4245-bd22-5aecd14a312c-kube-api-access-wm4jb" (OuterVolumeSpecName: "kube-api-access-wm4jb") pod "6fcb5ee3-1258-4245-bd22-5aecd14a312c" (UID: "6fcb5ee3-1258-4245-bd22-5aecd14a312c"). InnerVolumeSpecName "kube-api-access-wm4jb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:38:26 crc kubenswrapper[4565]: I1125 09:38:26.666687 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fcb5ee3-1258-4245-bd22-5aecd14a312c-ceph" (OuterVolumeSpecName: "ceph") pod "6fcb5ee3-1258-4245-bd22-5aecd14a312c" (UID: "6fcb5ee3-1258-4245-bd22-5aecd14a312c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:38:26 crc kubenswrapper[4565]: I1125 09:38:26.681775 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fcb5ee3-1258-4245-bd22-5aecd14a312c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6fcb5ee3-1258-4245-bd22-5aecd14a312c" (UID: "6fcb5ee3-1258-4245-bd22-5aecd14a312c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:38:26 crc kubenswrapper[4565]: I1125 09:38:26.684865 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fcb5ee3-1258-4245-bd22-5aecd14a312c-inventory" (OuterVolumeSpecName: "inventory") pod "6fcb5ee3-1258-4245-bd22-5aecd14a312c" (UID: "6fcb5ee3-1258-4245-bd22-5aecd14a312c"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:38:26 crc kubenswrapper[4565]: I1125 09:38:26.756620 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm4jb\" (UniqueName: \"kubernetes.io/projected/6fcb5ee3-1258-4245-bd22-5aecd14a312c-kube-api-access-wm4jb\") on node \"crc\" DevicePath \"\"" Nov 25 09:38:26 crc kubenswrapper[4565]: I1125 09:38:26.756889 4565 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fcb5ee3-1258-4245-bd22-5aecd14a312c-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 09:38:26 crc kubenswrapper[4565]: I1125 09:38:26.756904 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6fcb5ee3-1258-4245-bd22-5aecd14a312c-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 09:38:26 crc kubenswrapper[4565]: I1125 09:38:26.756915 4565 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6fcb5ee3-1258-4245-bd22-5aecd14a312c-ceph\") on node \"crc\" DevicePath \"\"" Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.325073 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v" event={"ID":"6fcb5ee3-1258-4245-bd22-5aecd14a312c","Type":"ContainerDied","Data":"64b9a87e4afdcd31690b9b293a7044117de2fe8f3f26c9b1ee41db7e3c6b5540"} Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.325137 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64b9a87e4afdcd31690b9b293a7044117de2fe8f3f26c9b1ee41db7e3c6b5540" Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.325255 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v" Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.406104 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jq72p"] Nov 25 09:38:27 crc kubenswrapper[4565]: E1125 09:38:27.406717 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fcb5ee3-1258-4245-bd22-5aecd14a312c" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.406747 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fcb5ee3-1258-4245-bd22-5aecd14a312c" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.407064 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fcb5ee3-1258-4245-bd22-5aecd14a312c" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.408189 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jq72p" Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.410699 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.411016 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc" Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.411337 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.411504 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.412102 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.430291 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jq72p"] Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.471779 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jq72p\" (UID: \"49dcd4ab-323d-4499-97a6-69fe4e29a0a6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jq72p" Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.471962 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jq72p\" (UID: \"49dcd4ab-323d-4499-97a6-69fe4e29a0a6\") " 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jq72p" Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.472132 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8p52\" (UniqueName: \"kubernetes.io/projected/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-kube-api-access-s8p52\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jq72p\" (UID: \"49dcd4ab-323d-4499-97a6-69fe4e29a0a6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jq72p" Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.472365 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jq72p\" (UID: \"49dcd4ab-323d-4499-97a6-69fe4e29a0a6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jq72p" Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.573990 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8p52\" (UniqueName: \"kubernetes.io/projected/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-kube-api-access-s8p52\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jq72p\" (UID: \"49dcd4ab-323d-4499-97a6-69fe4e29a0a6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jq72p" Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.574050 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jq72p\" (UID: \"49dcd4ab-323d-4499-97a6-69fe4e29a0a6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jq72p" Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.574134 4565 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jq72p\" (UID: \"49dcd4ab-323d-4499-97a6-69fe4e29a0a6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jq72p" Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.574167 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jq72p\" (UID: \"49dcd4ab-323d-4499-97a6-69fe4e29a0a6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jq72p" Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.579338 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jq72p\" (UID: \"49dcd4ab-323d-4499-97a6-69fe4e29a0a6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jq72p" Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.579380 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jq72p\" (UID: \"49dcd4ab-323d-4499-97a6-69fe4e29a0a6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jq72p" Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.579863 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jq72p\" (UID: \"49dcd4ab-323d-4499-97a6-69fe4e29a0a6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jq72p" Nov 25 09:38:27 crc 
kubenswrapper[4565]: I1125 09:38:27.593358 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8p52\" (UniqueName: \"kubernetes.io/projected/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-kube-api-access-s8p52\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jq72p\" (UID: \"49dcd4ab-323d-4499-97a6-69fe4e29a0a6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jq72p" Nov 25 09:38:27 crc kubenswrapper[4565]: I1125 09:38:27.723829 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jq72p" Nov 25 09:38:28 crc kubenswrapper[4565]: I1125 09:38:28.208337 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jq72p"] Nov 25 09:38:28 crc kubenswrapper[4565]: W1125 09:38:28.210879 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49dcd4ab_323d_4499_97a6_69fe4e29a0a6.slice/crio-fc60ae98993617cabd6466311f09a53e848b806e862343e1057929df2e741bf0 WatchSource:0}: Error finding container fc60ae98993617cabd6466311f09a53e848b806e862343e1057929df2e741bf0: Status 404 returned error can't find the container with id fc60ae98993617cabd6466311f09a53e848b806e862343e1057929df2e741bf0 Nov 25 09:38:28 crc kubenswrapper[4565]: I1125 09:38:28.335591 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jq72p" event={"ID":"49dcd4ab-323d-4499-97a6-69fe4e29a0a6","Type":"ContainerStarted","Data":"fc60ae98993617cabd6466311f09a53e848b806e862343e1057929df2e741bf0"} Nov 25 09:38:29 crc kubenswrapper[4565]: I1125 09:38:29.347219 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jq72p" 
event={"ID":"49dcd4ab-323d-4499-97a6-69fe4e29a0a6","Type":"ContainerStarted","Data":"851527bd0992e6d93ef76eacb0bc8107aa1dee77b6cfea54cfc7f1796fb598fc"} Nov 25 09:38:29 crc kubenswrapper[4565]: I1125 09:38:29.370191 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jq72p" podStartSLOduration=1.895738262 podStartE2EDuration="2.370176134s" podCreationTimestamp="2025-11-25 09:38:27 +0000 UTC" firstStartedPulling="2025-11-25 09:38:28.213466976 +0000 UTC m=+2041.415962114" lastFinishedPulling="2025-11-25 09:38:28.687904858 +0000 UTC m=+2041.890399986" observedRunningTime="2025-11-25 09:38:29.363721624 +0000 UTC m=+2042.566216763" watchObservedRunningTime="2025-11-25 09:38:29.370176134 +0000 UTC m=+2042.572671271" Nov 25 09:38:33 crc kubenswrapper[4565]: E1125 09:38:33.544875 4565 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fcb5ee3_1258_4245_bd22_5aecd14a312c.slice\": RecentStats: unable to find data in memory cache]" Nov 25 09:38:43 crc kubenswrapper[4565]: E1125 09:38:43.756328 4565 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fcb5ee3_1258_4245_bd22_5aecd14a312c.slice\": RecentStats: unable to find data in memory cache]" Nov 25 09:38:53 crc kubenswrapper[4565]: E1125 09:38:53.979162 4565 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fcb5ee3_1258_4245_bd22_5aecd14a312c.slice\": RecentStats: unable to find data in memory cache]" Nov 25 09:39:01 crc kubenswrapper[4565]: I1125 09:39:01.637239 4565 generic.go:334] "Generic (PLEG): container finished" podID="49dcd4ab-323d-4499-97a6-69fe4e29a0a6" 
containerID="851527bd0992e6d93ef76eacb0bc8107aa1dee77b6cfea54cfc7f1796fb598fc" exitCode=0 Nov 25 09:39:01 crc kubenswrapper[4565]: I1125 09:39:01.637326 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jq72p" event={"ID":"49dcd4ab-323d-4499-97a6-69fe4e29a0a6","Type":"ContainerDied","Data":"851527bd0992e6d93ef76eacb0bc8107aa1dee77b6cfea54cfc7f1796fb598fc"} Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.053714 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jq72p" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.167418 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-ceph\") pod \"49dcd4ab-323d-4499-97a6-69fe4e29a0a6\" (UID: \"49dcd4ab-323d-4499-97a6-69fe4e29a0a6\") " Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.167602 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8p52\" (UniqueName: \"kubernetes.io/projected/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-kube-api-access-s8p52\") pod \"49dcd4ab-323d-4499-97a6-69fe4e29a0a6\" (UID: \"49dcd4ab-323d-4499-97a6-69fe4e29a0a6\") " Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.167630 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-ssh-key\") pod \"49dcd4ab-323d-4499-97a6-69fe4e29a0a6\" (UID: \"49dcd4ab-323d-4499-97a6-69fe4e29a0a6\") " Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.167711 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-inventory\") pod \"49dcd4ab-323d-4499-97a6-69fe4e29a0a6\" (UID: 
\"49dcd4ab-323d-4499-97a6-69fe4e29a0a6\") " Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.179803 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-ceph" (OuterVolumeSpecName: "ceph") pod "49dcd4ab-323d-4499-97a6-69fe4e29a0a6" (UID: "49dcd4ab-323d-4499-97a6-69fe4e29a0a6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.179984 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-kube-api-access-s8p52" (OuterVolumeSpecName: "kube-api-access-s8p52") pod "49dcd4ab-323d-4499-97a6-69fe4e29a0a6" (UID: "49dcd4ab-323d-4499-97a6-69fe4e29a0a6"). InnerVolumeSpecName "kube-api-access-s8p52". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:39:03 crc kubenswrapper[4565]: E1125 09:39:03.191739 4565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-inventory podName:49dcd4ab-323d-4499-97a6-69fe4e29a0a6 nodeName:}" failed. No retries permitted until 2025-11-25 09:39:03.691704195 +0000 UTC m=+2076.894199343 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-inventory") pod "49dcd4ab-323d-4499-97a6-69fe4e29a0a6" (UID: "49dcd4ab-323d-4499-97a6-69fe4e29a0a6") : error deleting /var/lib/kubelet/pods/49dcd4ab-323d-4499-97a6-69fe4e29a0a6/volume-subpaths: remove /var/lib/kubelet/pods/49dcd4ab-323d-4499-97a6-69fe4e29a0a6/volume-subpaths: no such file or directory Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.194206 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "49dcd4ab-323d-4499-97a6-69fe4e29a0a6" (UID: "49dcd4ab-323d-4499-97a6-69fe4e29a0a6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.270138 4565 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-ceph\") on node \"crc\" DevicePath \"\"" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.270168 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8p52\" (UniqueName: \"kubernetes.io/projected/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-kube-api-access-s8p52\") on node \"crc\" DevicePath \"\"" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.270181 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.635051 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-22x6n"] Nov 25 09:39:03 crc kubenswrapper[4565]: E1125 09:39:03.635547 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49dcd4ab-323d-4499-97a6-69fe4e29a0a6" 
containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.635566 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="49dcd4ab-323d-4499-97a6-69fe4e29a0a6" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.635766 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="49dcd4ab-323d-4499-97a6-69fe4e29a0a6" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.637224 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-22x6n" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.647549 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-22x6n"] Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.655145 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jq72p" event={"ID":"49dcd4ab-323d-4499-97a6-69fe4e29a0a6","Type":"ContainerDied","Data":"fc60ae98993617cabd6466311f09a53e848b806e862343e1057929df2e741bf0"} Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.655187 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc60ae98993617cabd6466311f09a53e848b806e862343e1057929df2e741bf0" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.655194 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jq72p" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.678984 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18206c6b-af51-41b2-af35-ca7c5ec22d92-utilities\") pod \"community-operators-22x6n\" (UID: \"18206c6b-af51-41b2-af35-ca7c5ec22d92\") " pod="openshift-marketplace/community-operators-22x6n" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.679088 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tqm6\" (UniqueName: \"kubernetes.io/projected/18206c6b-af51-41b2-af35-ca7c5ec22d92-kube-api-access-9tqm6\") pod \"community-operators-22x6n\" (UID: \"18206c6b-af51-41b2-af35-ca7c5ec22d92\") " pod="openshift-marketplace/community-operators-22x6n" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.679154 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18206c6b-af51-41b2-af35-ca7c5ec22d92-catalog-content\") pod \"community-operators-22x6n\" (UID: \"18206c6b-af51-41b2-af35-ca7c5ec22d92\") " pod="openshift-marketplace/community-operators-22x6n" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.751075 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-x54cf"] Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.752103 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x54cf" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.763291 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-x54cf"] Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.779731 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-inventory\") pod \"49dcd4ab-323d-4499-97a6-69fe4e29a0a6\" (UID: \"49dcd4ab-323d-4499-97a6-69fe4e29a0a6\") " Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.779977 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p56c\" (UniqueName: \"kubernetes.io/projected/ff542e2a-3788-42a5-8a29-66f22838511d-kube-api-access-6p56c\") pod \"ssh-known-hosts-edpm-deployment-x54cf\" (UID: \"ff542e2a-3788-42a5-8a29-66f22838511d\") " pod="openstack/ssh-known-hosts-edpm-deployment-x54cf" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.780044 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18206c6b-af51-41b2-af35-ca7c5ec22d92-utilities\") pod \"community-operators-22x6n\" (UID: \"18206c6b-af51-41b2-af35-ca7c5ec22d92\") " pod="openshift-marketplace/community-operators-22x6n" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.780075 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ff542e2a-3788-42a5-8a29-66f22838511d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-x54cf\" (UID: \"ff542e2a-3788-42a5-8a29-66f22838511d\") " pod="openstack/ssh-known-hosts-edpm-deployment-x54cf" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.780106 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff542e2a-3788-42a5-8a29-66f22838511d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-x54cf\" (UID: \"ff542e2a-3788-42a5-8a29-66f22838511d\") " pod="openstack/ssh-known-hosts-edpm-deployment-x54cf" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.780153 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tqm6\" (UniqueName: \"kubernetes.io/projected/18206c6b-af51-41b2-af35-ca7c5ec22d92-kube-api-access-9tqm6\") pod \"community-operators-22x6n\" (UID: \"18206c6b-af51-41b2-af35-ca7c5ec22d92\") " pod="openshift-marketplace/community-operators-22x6n" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.780185 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18206c6b-af51-41b2-af35-ca7c5ec22d92-catalog-content\") pod \"community-operators-22x6n\" (UID: \"18206c6b-af51-41b2-af35-ca7c5ec22d92\") " pod="openshift-marketplace/community-operators-22x6n" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.780256 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff542e2a-3788-42a5-8a29-66f22838511d-ceph\") pod \"ssh-known-hosts-edpm-deployment-x54cf\" (UID: \"ff542e2a-3788-42a5-8a29-66f22838511d\") " pod="openstack/ssh-known-hosts-edpm-deployment-x54cf" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.781366 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18206c6b-af51-41b2-af35-ca7c5ec22d92-utilities\") pod \"community-operators-22x6n\" (UID: \"18206c6b-af51-41b2-af35-ca7c5ec22d92\") " pod="openshift-marketplace/community-operators-22x6n" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.781556 4565 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18206c6b-af51-41b2-af35-ca7c5ec22d92-catalog-content\") pod \"community-operators-22x6n\" (UID: \"18206c6b-af51-41b2-af35-ca7c5ec22d92\") " pod="openshift-marketplace/community-operators-22x6n" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.792539 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-inventory" (OuterVolumeSpecName: "inventory") pod "49dcd4ab-323d-4499-97a6-69fe4e29a0a6" (UID: "49dcd4ab-323d-4499-97a6-69fe4e29a0a6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.803741 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tqm6\" (UniqueName: \"kubernetes.io/projected/18206c6b-af51-41b2-af35-ca7c5ec22d92-kube-api-access-9tqm6\") pod \"community-operators-22x6n\" (UID: \"18206c6b-af51-41b2-af35-ca7c5ec22d92\") " pod="openshift-marketplace/community-operators-22x6n" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.881742 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff542e2a-3788-42a5-8a29-66f22838511d-ceph\") pod \"ssh-known-hosts-edpm-deployment-x54cf\" (UID: \"ff542e2a-3788-42a5-8a29-66f22838511d\") " pod="openstack/ssh-known-hosts-edpm-deployment-x54cf" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.881850 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p56c\" (UniqueName: \"kubernetes.io/projected/ff542e2a-3788-42a5-8a29-66f22838511d-kube-api-access-6p56c\") pod \"ssh-known-hosts-edpm-deployment-x54cf\" (UID: \"ff542e2a-3788-42a5-8a29-66f22838511d\") " pod="openstack/ssh-known-hosts-edpm-deployment-x54cf" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.881910 4565 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ff542e2a-3788-42a5-8a29-66f22838511d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-x54cf\" (UID: \"ff542e2a-3788-42a5-8a29-66f22838511d\") " pod="openstack/ssh-known-hosts-edpm-deployment-x54cf" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.881988 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff542e2a-3788-42a5-8a29-66f22838511d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-x54cf\" (UID: \"ff542e2a-3788-42a5-8a29-66f22838511d\") " pod="openstack/ssh-known-hosts-edpm-deployment-x54cf" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.882086 4565 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49dcd4ab-323d-4499-97a6-69fe4e29a0a6-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.886550 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ff542e2a-3788-42a5-8a29-66f22838511d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-x54cf\" (UID: \"ff542e2a-3788-42a5-8a29-66f22838511d\") " pod="openstack/ssh-known-hosts-edpm-deployment-x54cf" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.886978 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff542e2a-3788-42a5-8a29-66f22838511d-ceph\") pod \"ssh-known-hosts-edpm-deployment-x54cf\" (UID: \"ff542e2a-3788-42a5-8a29-66f22838511d\") " pod="openstack/ssh-known-hosts-edpm-deployment-x54cf" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.888366 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/ff542e2a-3788-42a5-8a29-66f22838511d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-x54cf\" (UID: \"ff542e2a-3788-42a5-8a29-66f22838511d\") " pod="openstack/ssh-known-hosts-edpm-deployment-x54cf" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.909177 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p56c\" (UniqueName: \"kubernetes.io/projected/ff542e2a-3788-42a5-8a29-66f22838511d-kube-api-access-6p56c\") pod \"ssh-known-hosts-edpm-deployment-x54cf\" (UID: \"ff542e2a-3788-42a5-8a29-66f22838511d\") " pod="openstack/ssh-known-hosts-edpm-deployment-x54cf" Nov 25 09:39:03 crc kubenswrapper[4565]: I1125 09:39:03.956478 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-22x6n" Nov 25 09:39:04 crc kubenswrapper[4565]: I1125 09:39:04.073481 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x54cf" Nov 25 09:39:04 crc kubenswrapper[4565]: E1125 09:39:04.209108 4565 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fcb5ee3_1258_4245_bd22_5aecd14a312c.slice\": RecentStats: unable to find data in memory cache]" Nov 25 09:39:04 crc kubenswrapper[4565]: I1125 09:39:04.432505 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-22x6n"] Nov 25 09:39:04 crc kubenswrapper[4565]: I1125 09:39:04.667994 4565 generic.go:334] "Generic (PLEG): container finished" podID="18206c6b-af51-41b2-af35-ca7c5ec22d92" containerID="703ca56a2f1eb15e20fcf6347d4f29f1a494545f594ec5bb27197dadeda7f548" exitCode=0 Nov 25 09:39:04 crc kubenswrapper[4565]: I1125 09:39:04.668060 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22x6n" 
event={"ID":"18206c6b-af51-41b2-af35-ca7c5ec22d92","Type":"ContainerDied","Data":"703ca56a2f1eb15e20fcf6347d4f29f1a494545f594ec5bb27197dadeda7f548"} Nov 25 09:39:04 crc kubenswrapper[4565]: I1125 09:39:04.668334 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22x6n" event={"ID":"18206c6b-af51-41b2-af35-ca7c5ec22d92","Type":"ContainerStarted","Data":"0eaaeae37514570d4337410cbae77eaa53628ac44c7ebea8c2949cd828e64948"} Nov 25 09:39:04 crc kubenswrapper[4565]: I1125 09:39:04.670267 4565 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 09:39:04 crc kubenswrapper[4565]: I1125 09:39:04.687874 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-x54cf"] Nov 25 09:39:04 crc kubenswrapper[4565]: W1125 09:39:04.696358 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff542e2a_3788_42a5_8a29_66f22838511d.slice/crio-413c6e59400b7c2357f3ca7b93fdc1fb288f36fa6dd0b7d0c01e9dc91975d550 WatchSource:0}: Error finding container 413c6e59400b7c2357f3ca7b93fdc1fb288f36fa6dd0b7d0c01e9dc91975d550: Status 404 returned error can't find the container with id 413c6e59400b7c2357f3ca7b93fdc1fb288f36fa6dd0b7d0c01e9dc91975d550 Nov 25 09:39:05 crc kubenswrapper[4565]: I1125 09:39:05.682567 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22x6n" event={"ID":"18206c6b-af51-41b2-af35-ca7c5ec22d92","Type":"ContainerStarted","Data":"aa353691a0c6e7dc62a12a90c08fb20d3b0737092aaac16a4fefd1e4e76435f4"} Nov 25 09:39:05 crc kubenswrapper[4565]: I1125 09:39:05.689710 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x54cf" event={"ID":"ff542e2a-3788-42a5-8a29-66f22838511d","Type":"ContainerStarted","Data":"1b0814a851c787a5dcd9fe32c1fb2bc651953ae2ffb96c01af3c78fa5da66bba"} 
Nov 25 09:39:05 crc kubenswrapper[4565]: I1125 09:39:05.689776 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x54cf" event={"ID":"ff542e2a-3788-42a5-8a29-66f22838511d","Type":"ContainerStarted","Data":"413c6e59400b7c2357f3ca7b93fdc1fb288f36fa6dd0b7d0c01e9dc91975d550"} Nov 25 09:39:05 crc kubenswrapper[4565]: I1125 09:39:05.723190 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-x54cf" podStartSLOduration=2.1583803169999998 podStartE2EDuration="2.723173393s" podCreationTimestamp="2025-11-25 09:39:03 +0000 UTC" firstStartedPulling="2025-11-25 09:39:04.698854907 +0000 UTC m=+2077.901350046" lastFinishedPulling="2025-11-25 09:39:05.263647985 +0000 UTC m=+2078.466143122" observedRunningTime="2025-11-25 09:39:05.719606669 +0000 UTC m=+2078.922101806" watchObservedRunningTime="2025-11-25 09:39:05.723173393 +0000 UTC m=+2078.925668530" Nov 25 09:39:06 crc kubenswrapper[4565]: I1125 09:39:06.703279 4565 generic.go:334] "Generic (PLEG): container finished" podID="18206c6b-af51-41b2-af35-ca7c5ec22d92" containerID="aa353691a0c6e7dc62a12a90c08fb20d3b0737092aaac16a4fefd1e4e76435f4" exitCode=0 Nov 25 09:39:06 crc kubenswrapper[4565]: I1125 09:39:06.703396 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22x6n" event={"ID":"18206c6b-af51-41b2-af35-ca7c5ec22d92","Type":"ContainerDied","Data":"aa353691a0c6e7dc62a12a90c08fb20d3b0737092aaac16a4fefd1e4e76435f4"} Nov 25 09:39:07 crc kubenswrapper[4565]: I1125 09:39:07.715675 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22x6n" event={"ID":"18206c6b-af51-41b2-af35-ca7c5ec22d92","Type":"ContainerStarted","Data":"88f6c5d2202b2b16e65e152043ce57d4703b8726b140140b2b5f2529393b7ab8"} Nov 25 09:39:07 crc kubenswrapper[4565]: I1125 09:39:07.740230 4565 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/community-operators-22x6n" podStartSLOduration=2.250824167 podStartE2EDuration="4.740211139s" podCreationTimestamp="2025-11-25 09:39:03 +0000 UTC" firstStartedPulling="2025-11-25 09:39:04.670007373 +0000 UTC m=+2077.872502511" lastFinishedPulling="2025-11-25 09:39:07.159394345 +0000 UTC m=+2080.361889483" observedRunningTime="2025-11-25 09:39:07.73619592 +0000 UTC m=+2080.938691057" watchObservedRunningTime="2025-11-25 09:39:07.740211139 +0000 UTC m=+2080.942706278" Nov 25 09:39:12 crc kubenswrapper[4565]: I1125 09:39:12.768691 4565 generic.go:334] "Generic (PLEG): container finished" podID="ff542e2a-3788-42a5-8a29-66f22838511d" containerID="1b0814a851c787a5dcd9fe32c1fb2bc651953ae2ffb96c01af3c78fa5da66bba" exitCode=0 Nov 25 09:39:12 crc kubenswrapper[4565]: I1125 09:39:12.768770 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x54cf" event={"ID":"ff542e2a-3788-42a5-8a29-66f22838511d","Type":"ContainerDied","Data":"1b0814a851c787a5dcd9fe32c1fb2bc651953ae2ffb96c01af3c78fa5da66bba"} Nov 25 09:39:13 crc kubenswrapper[4565]: I1125 09:39:13.957457 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-22x6n" Nov 25 09:39:13 crc kubenswrapper[4565]: I1125 09:39:13.957781 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-22x6n" Nov 25 09:39:13 crc kubenswrapper[4565]: I1125 09:39:13.991845 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-22x6n" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.187834 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x54cf" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.215810 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p56c\" (UniqueName: \"kubernetes.io/projected/ff542e2a-3788-42a5-8a29-66f22838511d-kube-api-access-6p56c\") pod \"ff542e2a-3788-42a5-8a29-66f22838511d\" (UID: \"ff542e2a-3788-42a5-8a29-66f22838511d\") " Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.216066 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff542e2a-3788-42a5-8a29-66f22838511d-ssh-key-openstack-edpm-ipam\") pod \"ff542e2a-3788-42a5-8a29-66f22838511d\" (UID: \"ff542e2a-3788-42a5-8a29-66f22838511d\") " Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.216165 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ff542e2a-3788-42a5-8a29-66f22838511d-inventory-0\") pod \"ff542e2a-3788-42a5-8a29-66f22838511d\" (UID: \"ff542e2a-3788-42a5-8a29-66f22838511d\") " Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.216278 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff542e2a-3788-42a5-8a29-66f22838511d-ceph\") pod \"ff542e2a-3788-42a5-8a29-66f22838511d\" (UID: \"ff542e2a-3788-42a5-8a29-66f22838511d\") " Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.228900 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff542e2a-3788-42a5-8a29-66f22838511d-kube-api-access-6p56c" (OuterVolumeSpecName: "kube-api-access-6p56c") pod "ff542e2a-3788-42a5-8a29-66f22838511d" (UID: "ff542e2a-3788-42a5-8a29-66f22838511d"). InnerVolumeSpecName "kube-api-access-6p56c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.234240 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff542e2a-3788-42a5-8a29-66f22838511d-ceph" (OuterVolumeSpecName: "ceph") pod "ff542e2a-3788-42a5-8a29-66f22838511d" (UID: "ff542e2a-3788-42a5-8a29-66f22838511d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.249750 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff542e2a-3788-42a5-8a29-66f22838511d-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "ff542e2a-3788-42a5-8a29-66f22838511d" (UID: "ff542e2a-3788-42a5-8a29-66f22838511d"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.251467 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff542e2a-3788-42a5-8a29-66f22838511d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ff542e2a-3788-42a5-8a29-66f22838511d" (UID: "ff542e2a-3788-42a5-8a29-66f22838511d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.319058 4565 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff542e2a-3788-42a5-8a29-66f22838511d-ceph\") on node \"crc\" DevicePath \"\"" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.319166 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p56c\" (UniqueName: \"kubernetes.io/projected/ff542e2a-3788-42a5-8a29-66f22838511d-kube-api-access-6p56c\") on node \"crc\" DevicePath \"\"" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.319237 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff542e2a-3788-42a5-8a29-66f22838511d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.319290 4565 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ff542e2a-3788-42a5-8a29-66f22838511d-inventory-0\") on node \"crc\" DevicePath \"\"" Nov 25 09:39:14 crc kubenswrapper[4565]: E1125 09:39:14.532638 4565 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fcb5ee3_1258_4245_bd22_5aecd14a312c.slice\": RecentStats: unable to find data in memory cache]" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.787157 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x54cf" event={"ID":"ff542e2a-3788-42a5-8a29-66f22838511d","Type":"ContainerDied","Data":"413c6e59400b7c2357f3ca7b93fdc1fb288f36fa6dd0b7d0c01e9dc91975d550"} Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.787507 4565 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="413c6e59400b7c2357f3ca7b93fdc1fb288f36fa6dd0b7d0c01e9dc91975d550" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.787400 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x54cf" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.836483 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-22x6n" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.866973 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-kl4bm"] Nov 25 09:39:14 crc kubenswrapper[4565]: E1125 09:39:14.867679 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff542e2a-3788-42a5-8a29-66f22838511d" containerName="ssh-known-hosts-edpm-deployment" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.867706 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff542e2a-3788-42a5-8a29-66f22838511d" containerName="ssh-known-hosts-edpm-deployment" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.868004 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff542e2a-3788-42a5-8a29-66f22838511d" containerName="ssh-known-hosts-edpm-deployment" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.868877 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kl4bm" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.872093 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.873168 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.875776 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.875944 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.876798 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.895595 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-kl4bm"] Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.915227 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-22x6n"] Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.933397 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kl4bm\" (UID: \"41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kl4bm" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.933499 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kl4bm\" (UID: \"41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kl4bm" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.933685 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kl4bm\" (UID: \"41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kl4bm" Nov 25 09:39:14 crc kubenswrapper[4565]: I1125 09:39:14.933747 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9t7b\" (UniqueName: \"kubernetes.io/projected/41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a-kube-api-access-p9t7b\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kl4bm\" (UID: \"41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kl4bm" Nov 25 09:39:15 crc kubenswrapper[4565]: I1125 09:39:15.034603 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kl4bm\" (UID: \"41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kl4bm" Nov 25 09:39:15 crc kubenswrapper[4565]: I1125 09:39:15.034659 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9t7b\" (UniqueName: \"kubernetes.io/projected/41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a-kube-api-access-p9t7b\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kl4bm\" (UID: \"41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kl4bm" Nov 25 
09:39:15 crc kubenswrapper[4565]: I1125 09:39:15.034736 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kl4bm\" (UID: \"41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kl4bm" Nov 25 09:39:15 crc kubenswrapper[4565]: I1125 09:39:15.034785 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kl4bm\" (UID: \"41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kl4bm" Nov 25 09:39:15 crc kubenswrapper[4565]: I1125 09:39:15.042443 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kl4bm\" (UID: \"41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kl4bm" Nov 25 09:39:15 crc kubenswrapper[4565]: I1125 09:39:15.043333 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kl4bm\" (UID: \"41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kl4bm" Nov 25 09:39:15 crc kubenswrapper[4565]: I1125 09:39:15.045940 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kl4bm\" (UID: \"41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kl4bm" Nov 25 09:39:15 crc kubenswrapper[4565]: I1125 09:39:15.054749 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9t7b\" (UniqueName: \"kubernetes.io/projected/41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a-kube-api-access-p9t7b\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kl4bm\" (UID: \"41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kl4bm" Nov 25 09:39:15 crc kubenswrapper[4565]: I1125 09:39:15.188725 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kl4bm" Nov 25 09:39:15 crc kubenswrapper[4565]: I1125 09:39:15.663875 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-kl4bm"] Nov 25 09:39:15 crc kubenswrapper[4565]: I1125 09:39:15.800173 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kl4bm" event={"ID":"41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a","Type":"ContainerStarted","Data":"5f259932a3841f65992f30abe76bfd5acd1240b1c80f72d264c385ff4d09166d"} Nov 25 09:39:16 crc kubenswrapper[4565]: I1125 09:39:16.808424 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kl4bm" event={"ID":"41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a","Type":"ContainerStarted","Data":"ee4e154ccc5d6a360a83bb4436c9259bef54af1b3f0e5cb122032b318b927a1d"} Nov 25 09:39:16 crc kubenswrapper[4565]: I1125 09:39:16.808797 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-22x6n" podUID="18206c6b-af51-41b2-af35-ca7c5ec22d92" containerName="registry-server" containerID="cri-o://88f6c5d2202b2b16e65e152043ce57d4703b8726b140140b2b5f2529393b7ab8" gracePeriod=2 Nov 25 09:39:16 crc kubenswrapper[4565]: I1125 09:39:16.831351 4565 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kl4bm" podStartSLOduration=2.326711703 podStartE2EDuration="2.831335334s" podCreationTimestamp="2025-11-25 09:39:14 +0000 UTC" firstStartedPulling="2025-11-25 09:39:15.669892923 +0000 UTC m=+2088.872388062" lastFinishedPulling="2025-11-25 09:39:16.174516554 +0000 UTC m=+2089.377011693" observedRunningTime="2025-11-25 09:39:16.823800388 +0000 UTC m=+2090.026295527" watchObservedRunningTime="2025-11-25 09:39:16.831335334 +0000 UTC m=+2090.033830472" Nov 25 09:39:17 crc kubenswrapper[4565]: I1125 09:39:17.257728 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-22x6n" Nov 25 09:39:17 crc kubenswrapper[4565]: I1125 09:39:17.292073 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18206c6b-af51-41b2-af35-ca7c5ec22d92-utilities\") pod \"18206c6b-af51-41b2-af35-ca7c5ec22d92\" (UID: \"18206c6b-af51-41b2-af35-ca7c5ec22d92\") " Nov 25 09:39:17 crc kubenswrapper[4565]: I1125 09:39:17.292149 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tqm6\" (UniqueName: \"kubernetes.io/projected/18206c6b-af51-41b2-af35-ca7c5ec22d92-kube-api-access-9tqm6\") pod \"18206c6b-af51-41b2-af35-ca7c5ec22d92\" (UID: \"18206c6b-af51-41b2-af35-ca7c5ec22d92\") " Nov 25 09:39:17 crc kubenswrapper[4565]: I1125 09:39:17.292183 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18206c6b-af51-41b2-af35-ca7c5ec22d92-catalog-content\") pod \"18206c6b-af51-41b2-af35-ca7c5ec22d92\" (UID: \"18206c6b-af51-41b2-af35-ca7c5ec22d92\") " Nov 25 09:39:17 crc kubenswrapper[4565]: I1125 09:39:17.292757 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/18206c6b-af51-41b2-af35-ca7c5ec22d92-utilities" (OuterVolumeSpecName: "utilities") pod "18206c6b-af51-41b2-af35-ca7c5ec22d92" (UID: "18206c6b-af51-41b2-af35-ca7c5ec22d92"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:39:17 crc kubenswrapper[4565]: I1125 09:39:17.297525 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18206c6b-af51-41b2-af35-ca7c5ec22d92-kube-api-access-9tqm6" (OuterVolumeSpecName: "kube-api-access-9tqm6") pod "18206c6b-af51-41b2-af35-ca7c5ec22d92" (UID: "18206c6b-af51-41b2-af35-ca7c5ec22d92"). InnerVolumeSpecName "kube-api-access-9tqm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:39:17 crc kubenswrapper[4565]: I1125 09:39:17.334197 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18206c6b-af51-41b2-af35-ca7c5ec22d92-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18206c6b-af51-41b2-af35-ca7c5ec22d92" (UID: "18206c6b-af51-41b2-af35-ca7c5ec22d92"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:39:17 crc kubenswrapper[4565]: I1125 09:39:17.394799 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18206c6b-af51-41b2-af35-ca7c5ec22d92-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:39:17 crc kubenswrapper[4565]: I1125 09:39:17.394846 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tqm6\" (UniqueName: \"kubernetes.io/projected/18206c6b-af51-41b2-af35-ca7c5ec22d92-kube-api-access-9tqm6\") on node \"crc\" DevicePath \"\"" Nov 25 09:39:17 crc kubenswrapper[4565]: I1125 09:39:17.394874 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18206c6b-af51-41b2-af35-ca7c5ec22d92-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:39:17 crc kubenswrapper[4565]: I1125 09:39:17.818229 4565 generic.go:334] "Generic (PLEG): container finished" podID="18206c6b-af51-41b2-af35-ca7c5ec22d92" containerID="88f6c5d2202b2b16e65e152043ce57d4703b8726b140140b2b5f2529393b7ab8" exitCode=0 Nov 25 09:39:17 crc kubenswrapper[4565]: I1125 09:39:17.818326 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22x6n" event={"ID":"18206c6b-af51-41b2-af35-ca7c5ec22d92","Type":"ContainerDied","Data":"88f6c5d2202b2b16e65e152043ce57d4703b8726b140140b2b5f2529393b7ab8"} Nov 25 09:39:17 crc kubenswrapper[4565]: I1125 09:39:17.818401 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22x6n" event={"ID":"18206c6b-af51-41b2-af35-ca7c5ec22d92","Type":"ContainerDied","Data":"0eaaeae37514570d4337410cbae77eaa53628ac44c7ebea8c2949cd828e64948"} Nov 25 09:39:17 crc kubenswrapper[4565]: I1125 09:39:17.818423 4565 scope.go:117] "RemoveContainer" containerID="88f6c5d2202b2b16e65e152043ce57d4703b8726b140140b2b5f2529393b7ab8" Nov 25 09:39:17 crc kubenswrapper[4565]: I1125 
09:39:17.818348 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-22x6n" Nov 25 09:39:17 crc kubenswrapper[4565]: I1125 09:39:17.836768 4565 scope.go:117] "RemoveContainer" containerID="aa353691a0c6e7dc62a12a90c08fb20d3b0737092aaac16a4fefd1e4e76435f4" Nov 25 09:39:17 crc kubenswrapper[4565]: I1125 09:39:17.857310 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-22x6n"] Nov 25 09:39:17 crc kubenswrapper[4565]: I1125 09:39:17.863041 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-22x6n"] Nov 25 09:39:17 crc kubenswrapper[4565]: I1125 09:39:17.873960 4565 scope.go:117] "RemoveContainer" containerID="703ca56a2f1eb15e20fcf6347d4f29f1a494545f594ec5bb27197dadeda7f548" Nov 25 09:39:17 crc kubenswrapper[4565]: I1125 09:39:17.922082 4565 scope.go:117] "RemoveContainer" containerID="88f6c5d2202b2b16e65e152043ce57d4703b8726b140140b2b5f2529393b7ab8" Nov 25 09:39:17 crc kubenswrapper[4565]: E1125 09:39:17.922447 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88f6c5d2202b2b16e65e152043ce57d4703b8726b140140b2b5f2529393b7ab8\": container with ID starting with 88f6c5d2202b2b16e65e152043ce57d4703b8726b140140b2b5f2529393b7ab8 not found: ID does not exist" containerID="88f6c5d2202b2b16e65e152043ce57d4703b8726b140140b2b5f2529393b7ab8" Nov 25 09:39:17 crc kubenswrapper[4565]: I1125 09:39:17.922489 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88f6c5d2202b2b16e65e152043ce57d4703b8726b140140b2b5f2529393b7ab8"} err="failed to get container status \"88f6c5d2202b2b16e65e152043ce57d4703b8726b140140b2b5f2529393b7ab8\": rpc error: code = NotFound desc = could not find container \"88f6c5d2202b2b16e65e152043ce57d4703b8726b140140b2b5f2529393b7ab8\": container with ID starting with 
88f6c5d2202b2b16e65e152043ce57d4703b8726b140140b2b5f2529393b7ab8 not found: ID does not exist" Nov 25 09:39:17 crc kubenswrapper[4565]: I1125 09:39:17.922519 4565 scope.go:117] "RemoveContainer" containerID="aa353691a0c6e7dc62a12a90c08fb20d3b0737092aaac16a4fefd1e4e76435f4" Nov 25 09:39:17 crc kubenswrapper[4565]: E1125 09:39:17.922755 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa353691a0c6e7dc62a12a90c08fb20d3b0737092aaac16a4fefd1e4e76435f4\": container with ID starting with aa353691a0c6e7dc62a12a90c08fb20d3b0737092aaac16a4fefd1e4e76435f4 not found: ID does not exist" containerID="aa353691a0c6e7dc62a12a90c08fb20d3b0737092aaac16a4fefd1e4e76435f4" Nov 25 09:39:17 crc kubenswrapper[4565]: I1125 09:39:17.922772 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa353691a0c6e7dc62a12a90c08fb20d3b0737092aaac16a4fefd1e4e76435f4"} err="failed to get container status \"aa353691a0c6e7dc62a12a90c08fb20d3b0737092aaac16a4fefd1e4e76435f4\": rpc error: code = NotFound desc = could not find container \"aa353691a0c6e7dc62a12a90c08fb20d3b0737092aaac16a4fefd1e4e76435f4\": container with ID starting with aa353691a0c6e7dc62a12a90c08fb20d3b0737092aaac16a4fefd1e4e76435f4 not found: ID does not exist" Nov 25 09:39:17 crc kubenswrapper[4565]: I1125 09:39:17.922784 4565 scope.go:117] "RemoveContainer" containerID="703ca56a2f1eb15e20fcf6347d4f29f1a494545f594ec5bb27197dadeda7f548" Nov 25 09:39:17 crc kubenswrapper[4565]: E1125 09:39:17.922991 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"703ca56a2f1eb15e20fcf6347d4f29f1a494545f594ec5bb27197dadeda7f548\": container with ID starting with 703ca56a2f1eb15e20fcf6347d4f29f1a494545f594ec5bb27197dadeda7f548 not found: ID does not exist" containerID="703ca56a2f1eb15e20fcf6347d4f29f1a494545f594ec5bb27197dadeda7f548" Nov 25 09:39:17 crc 
kubenswrapper[4565]: I1125 09:39:17.923010 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"703ca56a2f1eb15e20fcf6347d4f29f1a494545f594ec5bb27197dadeda7f548"} err="failed to get container status \"703ca56a2f1eb15e20fcf6347d4f29f1a494545f594ec5bb27197dadeda7f548\": rpc error: code = NotFound desc = could not find container \"703ca56a2f1eb15e20fcf6347d4f29f1a494545f594ec5bb27197dadeda7f548\": container with ID starting with 703ca56a2f1eb15e20fcf6347d4f29f1a494545f594ec5bb27197dadeda7f548 not found: ID does not exist" Nov 25 09:39:19 crc kubenswrapper[4565]: I1125 09:39:19.108059 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18206c6b-af51-41b2-af35-ca7c5ec22d92" path="/var/lib/kubelet/pods/18206c6b-af51-41b2-af35-ca7c5ec22d92/volumes" Nov 25 09:39:22 crc kubenswrapper[4565]: I1125 09:39:22.879891 4565 generic.go:334] "Generic (PLEG): container finished" podID="41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a" containerID="ee4e154ccc5d6a360a83bb4436c9259bef54af1b3f0e5cb122032b318b927a1d" exitCode=0 Nov 25 09:39:22 crc kubenswrapper[4565]: I1125 09:39:22.879991 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kl4bm" event={"ID":"41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a","Type":"ContainerDied","Data":"ee4e154ccc5d6a360a83bb4436c9259bef54af1b3f0e5cb122032b318b927a1d"} Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.307546 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kl4bm" Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.465212 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a-ssh-key\") pod \"41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a\" (UID: \"41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a\") " Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.465644 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a-inventory\") pod \"41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a\" (UID: \"41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a\") " Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.465966 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a-ceph\") pod \"41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a\" (UID: \"41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a\") " Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.466213 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9t7b\" (UniqueName: \"kubernetes.io/projected/41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a-kube-api-access-p9t7b\") pod \"41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a\" (UID: \"41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a\") " Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.472880 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a-ceph" (OuterVolumeSpecName: "ceph") pod "41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a" (UID: "41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.472903 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a-kube-api-access-p9t7b" (OuterVolumeSpecName: "kube-api-access-p9t7b") pod "41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a" (UID: "41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a"). InnerVolumeSpecName "kube-api-access-p9t7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.491112 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a" (UID: "41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.493853 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a-inventory" (OuterVolumeSpecName: "inventory") pod "41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a" (UID: "41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.569148 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9t7b\" (UniqueName: \"kubernetes.io/projected/41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a-kube-api-access-p9t7b\") on node \"crc\" DevicePath \"\"" Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.569185 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.569197 4565 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.569210 4565 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a-ceph\") on node \"crc\" DevicePath \"\"" Nov 25 09:39:24 crc kubenswrapper[4565]: E1125 09:39:24.768779 4565 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fcb5ee3_1258_4245_bd22_5aecd14a312c.slice\": RecentStats: unable to find data in memory cache]" Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.900388 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kl4bm" event={"ID":"41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a","Type":"ContainerDied","Data":"5f259932a3841f65992f30abe76bfd5acd1240b1c80f72d264c385ff4d09166d"} Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.900780 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f259932a3841f65992f30abe76bfd5acd1240b1c80f72d264c385ff4d09166d" Nov 25 09:39:24 crc 
kubenswrapper[4565]: I1125 09:39:24.900444 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kl4bm" Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.990122 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7"] Nov 25 09:39:24 crc kubenswrapper[4565]: E1125 09:39:24.990664 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18206c6b-af51-41b2-af35-ca7c5ec22d92" containerName="extract-utilities" Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.990689 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="18206c6b-af51-41b2-af35-ca7c5ec22d92" containerName="extract-utilities" Nov 25 09:39:24 crc kubenswrapper[4565]: E1125 09:39:24.990697 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.990705 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 25 09:39:24 crc kubenswrapper[4565]: E1125 09:39:24.990724 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18206c6b-af51-41b2-af35-ca7c5ec22d92" containerName="extract-content" Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.990730 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="18206c6b-af51-41b2-af35-ca7c5ec22d92" containerName="extract-content" Nov 25 09:39:24 crc kubenswrapper[4565]: E1125 09:39:24.990736 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18206c6b-af51-41b2-af35-ca7c5ec22d92" containerName="registry-server" Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.990741 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="18206c6b-af51-41b2-af35-ca7c5ec22d92" 
containerName="registry-server" Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.991021 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="18206c6b-af51-41b2-af35-ca7c5ec22d92" containerName="registry-server" Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.991069 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.991866 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7" Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.997520 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.997689 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.997809 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.997850 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc" Nov 25 09:39:24 crc kubenswrapper[4565]: I1125 09:39:24.997885 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 09:39:25 crc kubenswrapper[4565]: I1125 09:39:25.003336 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7"] Nov 25 09:39:25 crc kubenswrapper[4565]: I1125 09:39:25.185944 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9h8m\" (UniqueName: 
\"kubernetes.io/projected/dccca567-2d50-4077-8a64-803dafa14ffb-kube-api-access-s9h8m\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7\" (UID: \"dccca567-2d50-4077-8a64-803dafa14ffb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7" Nov 25 09:39:25 crc kubenswrapper[4565]: I1125 09:39:25.186001 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dccca567-2d50-4077-8a64-803dafa14ffb-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7\" (UID: \"dccca567-2d50-4077-8a64-803dafa14ffb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7" Nov 25 09:39:25 crc kubenswrapper[4565]: I1125 09:39:25.186061 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dccca567-2d50-4077-8a64-803dafa14ffb-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7\" (UID: \"dccca567-2d50-4077-8a64-803dafa14ffb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7" Nov 25 09:39:25 crc kubenswrapper[4565]: I1125 09:39:25.186223 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dccca567-2d50-4077-8a64-803dafa14ffb-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7\" (UID: \"dccca567-2d50-4077-8a64-803dafa14ffb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7" Nov 25 09:39:25 crc kubenswrapper[4565]: I1125 09:39:25.288702 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dccca567-2d50-4077-8a64-803dafa14ffb-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7\" (UID: \"dccca567-2d50-4077-8a64-803dafa14ffb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7" Nov 25 
09:39:25 crc kubenswrapper[4565]: I1125 09:39:25.288983 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dccca567-2d50-4077-8a64-803dafa14ffb-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7\" (UID: \"dccca567-2d50-4077-8a64-803dafa14ffb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7" Nov 25 09:39:25 crc kubenswrapper[4565]: I1125 09:39:25.289225 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9h8m\" (UniqueName: \"kubernetes.io/projected/dccca567-2d50-4077-8a64-803dafa14ffb-kube-api-access-s9h8m\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7\" (UID: \"dccca567-2d50-4077-8a64-803dafa14ffb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7" Nov 25 09:39:25 crc kubenswrapper[4565]: I1125 09:39:25.289333 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dccca567-2d50-4077-8a64-803dafa14ffb-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7\" (UID: \"dccca567-2d50-4077-8a64-803dafa14ffb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7" Nov 25 09:39:25 crc kubenswrapper[4565]: I1125 09:39:25.293498 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dccca567-2d50-4077-8a64-803dafa14ffb-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7\" (UID: \"dccca567-2d50-4077-8a64-803dafa14ffb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7" Nov 25 09:39:25 crc kubenswrapper[4565]: I1125 09:39:25.293586 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dccca567-2d50-4077-8a64-803dafa14ffb-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7\" (UID: 
\"dccca567-2d50-4077-8a64-803dafa14ffb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7" Nov 25 09:39:25 crc kubenswrapper[4565]: I1125 09:39:25.295699 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dccca567-2d50-4077-8a64-803dafa14ffb-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7\" (UID: \"dccca567-2d50-4077-8a64-803dafa14ffb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7" Nov 25 09:39:25 crc kubenswrapper[4565]: I1125 09:39:25.304841 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9h8m\" (UniqueName: \"kubernetes.io/projected/dccca567-2d50-4077-8a64-803dafa14ffb-kube-api-access-s9h8m\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7\" (UID: \"dccca567-2d50-4077-8a64-803dafa14ffb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7" Nov 25 09:39:25 crc kubenswrapper[4565]: I1125 09:39:25.309053 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7" Nov 25 09:39:25 crc kubenswrapper[4565]: I1125 09:39:25.796971 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7"] Nov 25 09:39:25 crc kubenswrapper[4565]: I1125 09:39:25.911195 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7" event={"ID":"dccca567-2d50-4077-8a64-803dafa14ffb","Type":"ContainerStarted","Data":"f672751dd65c57c4f308a11c728884152411738f0c7843a3d5be5b24d7b3090e"} Nov 25 09:39:26 crc kubenswrapper[4565]: I1125 09:39:26.938801 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7" event={"ID":"dccca567-2d50-4077-8a64-803dafa14ffb","Type":"ContainerStarted","Data":"49aa9b494224ac3ab2d49369d0a342d14323d8f5f5a46204493b454c76c00dd0"} Nov 25 09:39:26 crc kubenswrapper[4565]: I1125 09:39:26.959597 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7" podStartSLOduration=2.449352103 podStartE2EDuration="2.959561574s" podCreationTimestamp="2025-11-25 09:39:24 +0000 UTC" firstStartedPulling="2025-11-25 09:39:25.803738348 +0000 UTC m=+2099.006233486" lastFinishedPulling="2025-11-25 09:39:26.313947819 +0000 UTC m=+2099.516442957" observedRunningTime="2025-11-25 09:39:26.956532183 +0000 UTC m=+2100.159027321" watchObservedRunningTime="2025-11-25 09:39:26.959561574 +0000 UTC m=+2100.162056712" Nov 25 09:39:34 crc kubenswrapper[4565]: I1125 09:39:34.004696 4565 generic.go:334] "Generic (PLEG): container finished" podID="dccca567-2d50-4077-8a64-803dafa14ffb" containerID="49aa9b494224ac3ab2d49369d0a342d14323d8f5f5a46204493b454c76c00dd0" exitCode=0 Nov 25 09:39:34 crc kubenswrapper[4565]: I1125 09:39:34.004781 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7" event={"ID":"dccca567-2d50-4077-8a64-803dafa14ffb","Type":"ContainerDied","Data":"49aa9b494224ac3ab2d49369d0a342d14323d8f5f5a46204493b454c76c00dd0"} Nov 25 09:39:34 crc kubenswrapper[4565]: I1125 09:39:34.368282 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mnhgw"] Nov 25 09:39:34 crc kubenswrapper[4565]: I1125 09:39:34.370380 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mnhgw" Nov 25 09:39:34 crc kubenswrapper[4565]: I1125 09:39:34.387439 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnhgw"] Nov 25 09:39:34 crc kubenswrapper[4565]: I1125 09:39:34.391767 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdq7s\" (UniqueName: \"kubernetes.io/projected/c71c0afe-6073-477a-84e0-2cf73d509930-kube-api-access-tdq7s\") pod \"redhat-marketplace-mnhgw\" (UID: \"c71c0afe-6073-477a-84e0-2cf73d509930\") " pod="openshift-marketplace/redhat-marketplace-mnhgw" Nov 25 09:39:34 crc kubenswrapper[4565]: I1125 09:39:34.391863 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c71c0afe-6073-477a-84e0-2cf73d509930-utilities\") pod \"redhat-marketplace-mnhgw\" (UID: \"c71c0afe-6073-477a-84e0-2cf73d509930\") " pod="openshift-marketplace/redhat-marketplace-mnhgw" Nov 25 09:39:34 crc kubenswrapper[4565]: I1125 09:39:34.392096 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c71c0afe-6073-477a-84e0-2cf73d509930-catalog-content\") pod \"redhat-marketplace-mnhgw\" (UID: \"c71c0afe-6073-477a-84e0-2cf73d509930\") " pod="openshift-marketplace/redhat-marketplace-mnhgw" Nov 25 
09:39:34 crc kubenswrapper[4565]: I1125 09:39:34.494316 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdq7s\" (UniqueName: \"kubernetes.io/projected/c71c0afe-6073-477a-84e0-2cf73d509930-kube-api-access-tdq7s\") pod \"redhat-marketplace-mnhgw\" (UID: \"c71c0afe-6073-477a-84e0-2cf73d509930\") " pod="openshift-marketplace/redhat-marketplace-mnhgw" Nov 25 09:39:34 crc kubenswrapper[4565]: I1125 09:39:34.494396 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c71c0afe-6073-477a-84e0-2cf73d509930-utilities\") pod \"redhat-marketplace-mnhgw\" (UID: \"c71c0afe-6073-477a-84e0-2cf73d509930\") " pod="openshift-marketplace/redhat-marketplace-mnhgw" Nov 25 09:39:34 crc kubenswrapper[4565]: I1125 09:39:34.494498 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c71c0afe-6073-477a-84e0-2cf73d509930-catalog-content\") pod \"redhat-marketplace-mnhgw\" (UID: \"c71c0afe-6073-477a-84e0-2cf73d509930\") " pod="openshift-marketplace/redhat-marketplace-mnhgw" Nov 25 09:39:34 crc kubenswrapper[4565]: I1125 09:39:34.495040 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c71c0afe-6073-477a-84e0-2cf73d509930-utilities\") pod \"redhat-marketplace-mnhgw\" (UID: \"c71c0afe-6073-477a-84e0-2cf73d509930\") " pod="openshift-marketplace/redhat-marketplace-mnhgw" Nov 25 09:39:34 crc kubenswrapper[4565]: I1125 09:39:34.495076 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c71c0afe-6073-477a-84e0-2cf73d509930-catalog-content\") pod \"redhat-marketplace-mnhgw\" (UID: \"c71c0afe-6073-477a-84e0-2cf73d509930\") " pod="openshift-marketplace/redhat-marketplace-mnhgw" Nov 25 09:39:34 crc kubenswrapper[4565]: I1125 
09:39:34.524099 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdq7s\" (UniqueName: \"kubernetes.io/projected/c71c0afe-6073-477a-84e0-2cf73d509930-kube-api-access-tdq7s\") pod \"redhat-marketplace-mnhgw\" (UID: \"c71c0afe-6073-477a-84e0-2cf73d509930\") " pod="openshift-marketplace/redhat-marketplace-mnhgw" Nov 25 09:39:34 crc kubenswrapper[4565]: I1125 09:39:34.690749 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mnhgw" Nov 25 09:39:35 crc kubenswrapper[4565]: I1125 09:39:35.148132 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnhgw"] Nov 25 09:39:35 crc kubenswrapper[4565]: I1125 09:39:35.374674 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7" Nov 25 09:39:35 crc kubenswrapper[4565]: I1125 09:39:35.408762 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9h8m\" (UniqueName: \"kubernetes.io/projected/dccca567-2d50-4077-8a64-803dafa14ffb-kube-api-access-s9h8m\") pod \"dccca567-2d50-4077-8a64-803dafa14ffb\" (UID: \"dccca567-2d50-4077-8a64-803dafa14ffb\") " Nov 25 09:39:35 crc kubenswrapper[4565]: I1125 09:39:35.408921 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dccca567-2d50-4077-8a64-803dafa14ffb-ceph\") pod \"dccca567-2d50-4077-8a64-803dafa14ffb\" (UID: \"dccca567-2d50-4077-8a64-803dafa14ffb\") " Nov 25 09:39:35 crc kubenswrapper[4565]: I1125 09:39:35.409014 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dccca567-2d50-4077-8a64-803dafa14ffb-ssh-key\") pod \"dccca567-2d50-4077-8a64-803dafa14ffb\" (UID: \"dccca567-2d50-4077-8a64-803dafa14ffb\") " Nov 25 09:39:35 crc 
kubenswrapper[4565]: I1125 09:39:35.409247 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dccca567-2d50-4077-8a64-803dafa14ffb-inventory\") pod \"dccca567-2d50-4077-8a64-803dafa14ffb\" (UID: \"dccca567-2d50-4077-8a64-803dafa14ffb\") " Nov 25 09:39:35 crc kubenswrapper[4565]: I1125 09:39:35.417257 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dccca567-2d50-4077-8a64-803dafa14ffb-kube-api-access-s9h8m" (OuterVolumeSpecName: "kube-api-access-s9h8m") pod "dccca567-2d50-4077-8a64-803dafa14ffb" (UID: "dccca567-2d50-4077-8a64-803dafa14ffb"). InnerVolumeSpecName "kube-api-access-s9h8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:39:35 crc kubenswrapper[4565]: I1125 09:39:35.417987 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dccca567-2d50-4077-8a64-803dafa14ffb-ceph" (OuterVolumeSpecName: "ceph") pod "dccca567-2d50-4077-8a64-803dafa14ffb" (UID: "dccca567-2d50-4077-8a64-803dafa14ffb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:39:35 crc kubenswrapper[4565]: I1125 09:39:35.437327 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dccca567-2d50-4077-8a64-803dafa14ffb-inventory" (OuterVolumeSpecName: "inventory") pod "dccca567-2d50-4077-8a64-803dafa14ffb" (UID: "dccca567-2d50-4077-8a64-803dafa14ffb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:39:35 crc kubenswrapper[4565]: I1125 09:39:35.439354 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dccca567-2d50-4077-8a64-803dafa14ffb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dccca567-2d50-4077-8a64-803dafa14ffb" (UID: "dccca567-2d50-4077-8a64-803dafa14ffb"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:39:35 crc kubenswrapper[4565]: I1125 09:39:35.512551 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9h8m\" (UniqueName: \"kubernetes.io/projected/dccca567-2d50-4077-8a64-803dafa14ffb-kube-api-access-s9h8m\") on node \"crc\" DevicePath \"\"" Nov 25 09:39:35 crc kubenswrapper[4565]: I1125 09:39:35.512589 4565 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dccca567-2d50-4077-8a64-803dafa14ffb-ceph\") on node \"crc\" DevicePath \"\"" Nov 25 09:39:35 crc kubenswrapper[4565]: I1125 09:39:35.512601 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dccca567-2d50-4077-8a64-803dafa14ffb-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 09:39:35 crc kubenswrapper[4565]: I1125 09:39:35.512611 4565 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dccca567-2d50-4077-8a64-803dafa14ffb-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.025817 4565 generic.go:334] "Generic (PLEG): container finished" podID="c71c0afe-6073-477a-84e0-2cf73d509930" containerID="50c88094b1c975bc3cc0ffdd086579fff3f8221a28e76a3b798d83301ef583ea" exitCode=0 Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.026057 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnhgw" event={"ID":"c71c0afe-6073-477a-84e0-2cf73d509930","Type":"ContainerDied","Data":"50c88094b1c975bc3cc0ffdd086579fff3f8221a28e76a3b798d83301ef583ea"} Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.026335 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnhgw" event={"ID":"c71c0afe-6073-477a-84e0-2cf73d509930","Type":"ContainerStarted","Data":"e829980e16a3217f855d4d1314a85d06a87733309fa9ae68a1fb8927e6b9308d"} Nov 25 
09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.028383 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7" event={"ID":"dccca567-2d50-4077-8a64-803dafa14ffb","Type":"ContainerDied","Data":"f672751dd65c57c4f308a11c728884152411738f0c7843a3d5be5b24d7b3090e"} Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.028412 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f672751dd65c57c4f308a11c728884152411738f0c7843a3d5be5b24d7b3090e" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.028473 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.117896 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9"] Nov 25 09:39:36 crc kubenswrapper[4565]: E1125 09:39:36.118303 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dccca567-2d50-4077-8a64-803dafa14ffb" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.118322 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="dccca567-2d50-4077-8a64-803dafa14ffb" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.118483 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="dccca567-2d50-4077-8a64-803dafa14ffb" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.119052 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.120270 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.120774 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.122826 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.122910 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.123083 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.122914 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.123356 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.123404 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.124234 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.124286 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e277484b-b7d0-4a20-9551-a4d62a9720ea-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.124396 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e277484b-b7d0-4a20-9551-a4d62a9720ea-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.124483 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.124524 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.124596 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.124632 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.124762 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.124794 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc 
kubenswrapper[4565]: I1125 09:39:36.124863 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.124893 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e277484b-b7d0-4a20-9551-a4d62a9720ea-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.124910 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jcml\" (UniqueName: \"kubernetes.io/projected/e277484b-b7d0-4a20-9551-a4d62a9720ea-kube-api-access-8jcml\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.124957 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.136463 4565 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9"] Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.227373 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.227492 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.227581 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.227621 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.227717 4565 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.227742 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.227823 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.227859 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e277484b-b7d0-4a20-9551-a4d62a9720ea-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.227885 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jcml\" (UniqueName: 
\"kubernetes.io/projected/e277484b-b7d0-4a20-9551-a4d62a9720ea-kube-api-access-8jcml\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.227947 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.227989 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.228023 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e277484b-b7d0-4a20-9551-a4d62a9720ea-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.228131 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/e277484b-b7d0-4a20-9551-a4d62a9720ea-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.235865 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.236399 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.236476 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.237140 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e277484b-b7d0-4a20-9551-a4d62a9720ea-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.237296 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.240421 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.240539 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.241803 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e277484b-b7d0-4a20-9551-a4d62a9720ea-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.242453 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.243463 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.243705 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e277484b-b7d0-4a20-9551-a4d62a9720ea-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.244093 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.249075 4565 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-8jcml\" (UniqueName: \"kubernetes.io/projected/e277484b-b7d0-4a20-9551-a4d62a9720ea-kube-api-access-8jcml\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-txnr9\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:36 crc kubenswrapper[4565]: I1125 09:39:36.435635 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:39:37 crc kubenswrapper[4565]: I1125 09:39:37.009141 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9"] Nov 25 09:39:37 crc kubenswrapper[4565]: I1125 09:39:37.037883 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" event={"ID":"e277484b-b7d0-4a20-9551-a4d62a9720ea","Type":"ContainerStarted","Data":"676653d01ab1daf06f24a83a6a29c5b6071aa54674fa89b83ac494cb849515d7"} Nov 25 09:39:37 crc kubenswrapper[4565]: I1125 09:39:37.040269 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnhgw" event={"ID":"c71c0afe-6073-477a-84e0-2cf73d509930","Type":"ContainerStarted","Data":"549fdb384da5a1009ad473231543eba068fe5f021f2bd5eedf1f6bcb1f045090"} Nov 25 09:39:38 crc kubenswrapper[4565]: I1125 09:39:38.054782 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" event={"ID":"e277484b-b7d0-4a20-9551-a4d62a9720ea","Type":"ContainerStarted","Data":"a6d8fd2dc16b61dec2020431886e7f2a704c5c2a072c37bcc12bfe468a5acfd0"} Nov 25 09:39:38 crc kubenswrapper[4565]: I1125 09:39:38.059257 4565 generic.go:334] "Generic (PLEG): container finished" podID="c71c0afe-6073-477a-84e0-2cf73d509930" containerID="549fdb384da5a1009ad473231543eba068fe5f021f2bd5eedf1f6bcb1f045090" exitCode=0 Nov 25 
09:39:38 crc kubenswrapper[4565]: I1125 09:39:38.059337 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnhgw" event={"ID":"c71c0afe-6073-477a-84e0-2cf73d509930","Type":"ContainerDied","Data":"549fdb384da5a1009ad473231543eba068fe5f021f2bd5eedf1f6bcb1f045090"} Nov 25 09:39:38 crc kubenswrapper[4565]: I1125 09:39:38.105310 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" podStartSLOduration=1.6132127 podStartE2EDuration="2.105285259s" podCreationTimestamp="2025-11-25 09:39:36 +0000 UTC" firstStartedPulling="2025-11-25 09:39:37.007271649 +0000 UTC m=+2110.209766787" lastFinishedPulling="2025-11-25 09:39:37.499344207 +0000 UTC m=+2110.701839346" observedRunningTime="2025-11-25 09:39:38.07409663 +0000 UTC m=+2111.276591768" watchObservedRunningTime="2025-11-25 09:39:38.105285259 +0000 UTC m=+2111.307780397" Nov 25 09:39:39 crc kubenswrapper[4565]: I1125 09:39:39.075288 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnhgw" event={"ID":"c71c0afe-6073-477a-84e0-2cf73d509930","Type":"ContainerStarted","Data":"1a0e6b6f12879bec45229f4d2a45c54f81ade7dd05162737d7b53cc39c2dfd39"} Nov 25 09:39:39 crc kubenswrapper[4565]: I1125 09:39:39.107907 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mnhgw" podStartSLOduration=2.470196177 podStartE2EDuration="5.107881731s" podCreationTimestamp="2025-11-25 09:39:34 +0000 UTC" firstStartedPulling="2025-11-25 09:39:36.02855182 +0000 UTC m=+2109.231046958" lastFinishedPulling="2025-11-25 09:39:38.666237374 +0000 UTC m=+2111.868732512" observedRunningTime="2025-11-25 09:39:39.097706879 +0000 UTC m=+2112.300202017" watchObservedRunningTime="2025-11-25 09:39:39.107881731 +0000 UTC m=+2112.310376870" Nov 25 09:39:44 crc kubenswrapper[4565]: I1125 09:39:44.691422 4565 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mnhgw" Nov 25 09:39:44 crc kubenswrapper[4565]: I1125 09:39:44.692310 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mnhgw" Nov 25 09:39:44 crc kubenswrapper[4565]: I1125 09:39:44.749100 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mnhgw" Nov 25 09:39:45 crc kubenswrapper[4565]: I1125 09:39:45.176882 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mnhgw" Nov 25 09:39:45 crc kubenswrapper[4565]: I1125 09:39:45.227156 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnhgw"] Nov 25 09:39:47 crc kubenswrapper[4565]: I1125 09:39:47.145889 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mnhgw" podUID="c71c0afe-6073-477a-84e0-2cf73d509930" containerName="registry-server" containerID="cri-o://1a0e6b6f12879bec45229f4d2a45c54f81ade7dd05162737d7b53cc39c2dfd39" gracePeriod=2 Nov 25 09:39:47 crc kubenswrapper[4565]: I1125 09:39:47.547728 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mnhgw" Nov 25 09:39:47 crc kubenswrapper[4565]: I1125 09:39:47.684966 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c71c0afe-6073-477a-84e0-2cf73d509930-catalog-content\") pod \"c71c0afe-6073-477a-84e0-2cf73d509930\" (UID: \"c71c0afe-6073-477a-84e0-2cf73d509930\") " Nov 25 09:39:47 crc kubenswrapper[4565]: I1125 09:39:47.685112 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdq7s\" (UniqueName: \"kubernetes.io/projected/c71c0afe-6073-477a-84e0-2cf73d509930-kube-api-access-tdq7s\") pod \"c71c0afe-6073-477a-84e0-2cf73d509930\" (UID: \"c71c0afe-6073-477a-84e0-2cf73d509930\") " Nov 25 09:39:47 crc kubenswrapper[4565]: I1125 09:39:47.685370 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c71c0afe-6073-477a-84e0-2cf73d509930-utilities\") pod \"c71c0afe-6073-477a-84e0-2cf73d509930\" (UID: \"c71c0afe-6073-477a-84e0-2cf73d509930\") " Nov 25 09:39:47 crc kubenswrapper[4565]: I1125 09:39:47.686426 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c71c0afe-6073-477a-84e0-2cf73d509930-utilities" (OuterVolumeSpecName: "utilities") pod "c71c0afe-6073-477a-84e0-2cf73d509930" (UID: "c71c0afe-6073-477a-84e0-2cf73d509930"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:39:47 crc kubenswrapper[4565]: I1125 09:39:47.691081 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c71c0afe-6073-477a-84e0-2cf73d509930-kube-api-access-tdq7s" (OuterVolumeSpecName: "kube-api-access-tdq7s") pod "c71c0afe-6073-477a-84e0-2cf73d509930" (UID: "c71c0afe-6073-477a-84e0-2cf73d509930"). InnerVolumeSpecName "kube-api-access-tdq7s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:39:47 crc kubenswrapper[4565]: I1125 09:39:47.699034 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c71c0afe-6073-477a-84e0-2cf73d509930-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c71c0afe-6073-477a-84e0-2cf73d509930" (UID: "c71c0afe-6073-477a-84e0-2cf73d509930"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:39:47 crc kubenswrapper[4565]: I1125 09:39:47.787831 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c71c0afe-6073-477a-84e0-2cf73d509930-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:39:47 crc kubenswrapper[4565]: I1125 09:39:47.787858 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c71c0afe-6073-477a-84e0-2cf73d509930-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:39:47 crc kubenswrapper[4565]: I1125 09:39:47.787872 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdq7s\" (UniqueName: \"kubernetes.io/projected/c71c0afe-6073-477a-84e0-2cf73d509930-kube-api-access-tdq7s\") on node \"crc\" DevicePath \"\"" Nov 25 09:39:48 crc kubenswrapper[4565]: I1125 09:39:48.169669 4565 generic.go:334] "Generic (PLEG): container finished" podID="c71c0afe-6073-477a-84e0-2cf73d509930" containerID="1a0e6b6f12879bec45229f4d2a45c54f81ade7dd05162737d7b53cc39c2dfd39" exitCode=0 Nov 25 09:39:48 crc kubenswrapper[4565]: I1125 09:39:48.169723 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnhgw" event={"ID":"c71c0afe-6073-477a-84e0-2cf73d509930","Type":"ContainerDied","Data":"1a0e6b6f12879bec45229f4d2a45c54f81ade7dd05162737d7b53cc39c2dfd39"} Nov 25 09:39:48 crc kubenswrapper[4565]: I1125 09:39:48.169757 4565 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-mnhgw" event={"ID":"c71c0afe-6073-477a-84e0-2cf73d509930","Type":"ContainerDied","Data":"e829980e16a3217f855d4d1314a85d06a87733309fa9ae68a1fb8927e6b9308d"} Nov 25 09:39:48 crc kubenswrapper[4565]: I1125 09:39:48.169782 4565 scope.go:117] "RemoveContainer" containerID="1a0e6b6f12879bec45229f4d2a45c54f81ade7dd05162737d7b53cc39c2dfd39" Nov 25 09:39:48 crc kubenswrapper[4565]: I1125 09:39:48.170180 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mnhgw" Nov 25 09:39:48 crc kubenswrapper[4565]: I1125 09:39:48.193119 4565 scope.go:117] "RemoveContainer" containerID="549fdb384da5a1009ad473231543eba068fe5f021f2bd5eedf1f6bcb1f045090" Nov 25 09:39:48 crc kubenswrapper[4565]: I1125 09:39:48.213114 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnhgw"] Nov 25 09:39:48 crc kubenswrapper[4565]: I1125 09:39:48.226327 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnhgw"] Nov 25 09:39:48 crc kubenswrapper[4565]: I1125 09:39:48.228342 4565 scope.go:117] "RemoveContainer" containerID="50c88094b1c975bc3cc0ffdd086579fff3f8221a28e76a3b798d83301ef583ea" Nov 25 09:39:48 crc kubenswrapper[4565]: I1125 09:39:48.266989 4565 scope.go:117] "RemoveContainer" containerID="1a0e6b6f12879bec45229f4d2a45c54f81ade7dd05162737d7b53cc39c2dfd39" Nov 25 09:39:48 crc kubenswrapper[4565]: E1125 09:39:48.273297 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a0e6b6f12879bec45229f4d2a45c54f81ade7dd05162737d7b53cc39c2dfd39\": container with ID starting with 1a0e6b6f12879bec45229f4d2a45c54f81ade7dd05162737d7b53cc39c2dfd39 not found: ID does not exist" containerID="1a0e6b6f12879bec45229f4d2a45c54f81ade7dd05162737d7b53cc39c2dfd39" Nov 25 09:39:48 crc kubenswrapper[4565]: I1125 09:39:48.273346 4565 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a0e6b6f12879bec45229f4d2a45c54f81ade7dd05162737d7b53cc39c2dfd39"} err="failed to get container status \"1a0e6b6f12879bec45229f4d2a45c54f81ade7dd05162737d7b53cc39c2dfd39\": rpc error: code = NotFound desc = could not find container \"1a0e6b6f12879bec45229f4d2a45c54f81ade7dd05162737d7b53cc39c2dfd39\": container with ID starting with 1a0e6b6f12879bec45229f4d2a45c54f81ade7dd05162737d7b53cc39c2dfd39 not found: ID does not exist" Nov 25 09:39:48 crc kubenswrapper[4565]: I1125 09:39:48.273423 4565 scope.go:117] "RemoveContainer" containerID="549fdb384da5a1009ad473231543eba068fe5f021f2bd5eedf1f6bcb1f045090" Nov 25 09:39:48 crc kubenswrapper[4565]: E1125 09:39:48.273866 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"549fdb384da5a1009ad473231543eba068fe5f021f2bd5eedf1f6bcb1f045090\": container with ID starting with 549fdb384da5a1009ad473231543eba068fe5f021f2bd5eedf1f6bcb1f045090 not found: ID does not exist" containerID="549fdb384da5a1009ad473231543eba068fe5f021f2bd5eedf1f6bcb1f045090" Nov 25 09:39:48 crc kubenswrapper[4565]: I1125 09:39:48.273897 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"549fdb384da5a1009ad473231543eba068fe5f021f2bd5eedf1f6bcb1f045090"} err="failed to get container status \"549fdb384da5a1009ad473231543eba068fe5f021f2bd5eedf1f6bcb1f045090\": rpc error: code = NotFound desc = could not find container \"549fdb384da5a1009ad473231543eba068fe5f021f2bd5eedf1f6bcb1f045090\": container with ID starting with 549fdb384da5a1009ad473231543eba068fe5f021f2bd5eedf1f6bcb1f045090 not found: ID does not exist" Nov 25 09:39:48 crc kubenswrapper[4565]: I1125 09:39:48.273921 4565 scope.go:117] "RemoveContainer" containerID="50c88094b1c975bc3cc0ffdd086579fff3f8221a28e76a3b798d83301ef583ea" Nov 25 09:39:48 crc kubenswrapper[4565]: E1125 
09:39:48.275317 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50c88094b1c975bc3cc0ffdd086579fff3f8221a28e76a3b798d83301ef583ea\": container with ID starting with 50c88094b1c975bc3cc0ffdd086579fff3f8221a28e76a3b798d83301ef583ea not found: ID does not exist" containerID="50c88094b1c975bc3cc0ffdd086579fff3f8221a28e76a3b798d83301ef583ea" Nov 25 09:39:48 crc kubenswrapper[4565]: I1125 09:39:48.275338 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50c88094b1c975bc3cc0ffdd086579fff3f8221a28e76a3b798d83301ef583ea"} err="failed to get container status \"50c88094b1c975bc3cc0ffdd086579fff3f8221a28e76a3b798d83301ef583ea\": rpc error: code = NotFound desc = could not find container \"50c88094b1c975bc3cc0ffdd086579fff3f8221a28e76a3b798d83301ef583ea\": container with ID starting with 50c88094b1c975bc3cc0ffdd086579fff3f8221a28e76a3b798d83301ef583ea not found: ID does not exist" Nov 25 09:39:49 crc kubenswrapper[4565]: I1125 09:39:49.109882 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c71c0afe-6073-477a-84e0-2cf73d509930" path="/var/lib/kubelet/pods/c71c0afe-6073-477a-84e0-2cf73d509930/volumes" Nov 25 09:39:55 crc kubenswrapper[4565]: I1125 09:39:55.099394 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:39:55 crc kubenswrapper[4565]: I1125 09:39:55.100129 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Nov 25 09:40:02 crc kubenswrapper[4565]: I1125 09:40:02.298305 4565 generic.go:334] "Generic (PLEG): container finished" podID="e277484b-b7d0-4a20-9551-a4d62a9720ea" containerID="a6d8fd2dc16b61dec2020431886e7f2a704c5c2a072c37bcc12bfe468a5acfd0" exitCode=0 Nov 25 09:40:02 crc kubenswrapper[4565]: I1125 09:40:02.298366 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" event={"ID":"e277484b-b7d0-4a20-9551-a4d62a9720ea","Type":"ContainerDied","Data":"a6d8fd2dc16b61dec2020431886e7f2a704c5c2a072c37bcc12bfe468a5acfd0"} Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.713182 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.865373 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e277484b-b7d0-4a20-9551-a4d62a9720ea-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"e277484b-b7d0-4a20-9551-a4d62a9720ea\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.865705 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-bootstrap-combined-ca-bundle\") pod \"e277484b-b7d0-4a20-9551-a4d62a9720ea\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.865775 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-neutron-metadata-combined-ca-bundle\") pod \"e277484b-b7d0-4a20-9551-a4d62a9720ea\" (UID: 
\"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.865847 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-ceph\") pod \"e277484b-b7d0-4a20-9551-a4d62a9720ea\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.865914 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-nova-combined-ca-bundle\") pod \"e277484b-b7d0-4a20-9551-a4d62a9720ea\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.866034 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-ssh-key\") pod \"e277484b-b7d0-4a20-9551-a4d62a9720ea\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.866121 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-repo-setup-combined-ca-bundle\") pod \"e277484b-b7d0-4a20-9551-a4d62a9720ea\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.866161 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e277484b-b7d0-4a20-9551-a4d62a9720ea-openstack-edpm-ipam-ovn-default-certs-0\") pod \"e277484b-b7d0-4a20-9551-a4d62a9720ea\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.866253 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-libvirt-combined-ca-bundle\") pod \"e277484b-b7d0-4a20-9551-a4d62a9720ea\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.866287 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-ovn-combined-ca-bundle\") pod \"e277484b-b7d0-4a20-9551-a4d62a9720ea\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.866310 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-inventory\") pod \"e277484b-b7d0-4a20-9551-a4d62a9720ea\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.866331 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e277484b-b7d0-4a20-9551-a4d62a9720ea-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"e277484b-b7d0-4a20-9551-a4d62a9720ea\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.866368 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jcml\" (UniqueName: \"kubernetes.io/projected/e277484b-b7d0-4a20-9551-a4d62a9720ea-kube-api-access-8jcml\") pod \"e277484b-b7d0-4a20-9551-a4d62a9720ea\" (UID: \"e277484b-b7d0-4a20-9551-a4d62a9720ea\") " Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.879020 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-bootstrap-combined-ca-bundle" 
(OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e277484b-b7d0-4a20-9551-a4d62a9720ea" (UID: "e277484b-b7d0-4a20-9551-a4d62a9720ea"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.880698 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e277484b-b7d0-4a20-9551-a4d62a9720ea" (UID: "e277484b-b7d0-4a20-9551-a4d62a9720ea"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.881256 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-ceph" (OuterVolumeSpecName: "ceph") pod "e277484b-b7d0-4a20-9551-a4d62a9720ea" (UID: "e277484b-b7d0-4a20-9551-a4d62a9720ea"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.881766 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e277484b-b7d0-4a20-9551-a4d62a9720ea-kube-api-access-8jcml" (OuterVolumeSpecName: "kube-api-access-8jcml") pod "e277484b-b7d0-4a20-9551-a4d62a9720ea" (UID: "e277484b-b7d0-4a20-9551-a4d62a9720ea"). InnerVolumeSpecName "kube-api-access-8jcml". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.883001 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e277484b-b7d0-4a20-9551-a4d62a9720ea" (UID: "e277484b-b7d0-4a20-9551-a4d62a9720ea"). 
InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.884028 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e277484b-b7d0-4a20-9551-a4d62a9720ea-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "e277484b-b7d0-4a20-9551-a4d62a9720ea" (UID: "e277484b-b7d0-4a20-9551-a4d62a9720ea"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.900820 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e277484b-b7d0-4a20-9551-a4d62a9720ea-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "e277484b-b7d0-4a20-9551-a4d62a9720ea" (UID: "e277484b-b7d0-4a20-9551-a4d62a9720ea"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.900851 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e277484b-b7d0-4a20-9551-a4d62a9720ea-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "e277484b-b7d0-4a20-9551-a4d62a9720ea" (UID: "e277484b-b7d0-4a20-9551-a4d62a9720ea"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.900860 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e277484b-b7d0-4a20-9551-a4d62a9720ea" (UID: "e277484b-b7d0-4a20-9551-a4d62a9720ea"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.901246 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e277484b-b7d0-4a20-9551-a4d62a9720ea" (UID: "e277484b-b7d0-4a20-9551-a4d62a9720ea"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.902902 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "e277484b-b7d0-4a20-9551-a4d62a9720ea" (UID: "e277484b-b7d0-4a20-9551-a4d62a9720ea"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.906288 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-inventory" (OuterVolumeSpecName: "inventory") pod "e277484b-b7d0-4a20-9551-a4d62a9720ea" (UID: "e277484b-b7d0-4a20-9551-a4d62a9720ea"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.917016 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e277484b-b7d0-4a20-9551-a4d62a9720ea" (UID: "e277484b-b7d0-4a20-9551-a4d62a9720ea"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.969195 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.969228 4565 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.969241 4565 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e277484b-b7d0-4a20-9551-a4d62a9720ea-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.969255 4565 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.969267 4565 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.969276 4565 reconciler_common.go:293] 
"Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.969286 4565 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e277484b-b7d0-4a20-9551-a4d62a9720ea-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.969297 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jcml\" (UniqueName: \"kubernetes.io/projected/e277484b-b7d0-4a20-9551-a4d62a9720ea-kube-api-access-8jcml\") on node \"crc\" DevicePath \"\"" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.969308 4565 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e277484b-b7d0-4a20-9551-a4d62a9720ea-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.969319 4565 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.969329 4565 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:40:03 crc kubenswrapper[4565]: I1125 09:40:03.969338 4565 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-ceph\") on node \"crc\" DevicePath \"\"" Nov 25 09:40:03 crc kubenswrapper[4565]: 
I1125 09:40:03.969348 4565 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e277484b-b7d0-4a20-9551-a4d62a9720ea-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.318493 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" event={"ID":"e277484b-b7d0-4a20-9551-a4d62a9720ea","Type":"ContainerDied","Data":"676653d01ab1daf06f24a83a6a29c5b6071aa54674fa89b83ac494cb849515d7"} Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.318574 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="676653d01ab1daf06f24a83a6a29c5b6071aa54674fa89b83ac494cb849515d7" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.318915 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-txnr9" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.398944 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx"] Nov 25 09:40:04 crc kubenswrapper[4565]: E1125 09:40:04.399565 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c71c0afe-6073-477a-84e0-2cf73d509930" containerName="extract-content" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.399586 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c71c0afe-6073-477a-84e0-2cf73d509930" containerName="extract-content" Nov 25 09:40:04 crc kubenswrapper[4565]: E1125 09:40:04.399619 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c71c0afe-6073-477a-84e0-2cf73d509930" containerName="registry-server" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.399625 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c71c0afe-6073-477a-84e0-2cf73d509930" containerName="registry-server" Nov 
25 09:40:04 crc kubenswrapper[4565]: E1125 09:40:04.399640 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c71c0afe-6073-477a-84e0-2cf73d509930" containerName="extract-utilities" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.399647 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c71c0afe-6073-477a-84e0-2cf73d509930" containerName="extract-utilities" Nov 25 09:40:04 crc kubenswrapper[4565]: E1125 09:40:04.399667 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e277484b-b7d0-4a20-9551-a4d62a9720ea" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.399673 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="e277484b-b7d0-4a20-9551-a4d62a9720ea" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.399847 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="c71c0afe-6073-477a-84e0-2cf73d509930" containerName="registry-server" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.399872 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="e277484b-b7d0-4a20-9551-a4d62a9720ea" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.400468 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.402535 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.402836 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.403561 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.403757 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.404776 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.409348 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx"] Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.583950 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d22f3d2a-b6ff-4188-9778-e7108dd44f3a-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx\" (UID: \"d22f3d2a-b6ff-4188-9778-e7108dd44f3a\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.584346 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d22f3d2a-b6ff-4188-9778-e7108dd44f3a-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx\" (UID: \"d22f3d2a-b6ff-4188-9778-e7108dd44f3a\") " 
pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.584565 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d22f3d2a-b6ff-4188-9778-e7108dd44f3a-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx\" (UID: \"d22f3d2a-b6ff-4188-9778-e7108dd44f3a\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.584820 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bldbr\" (UniqueName: \"kubernetes.io/projected/d22f3d2a-b6ff-4188-9778-e7108dd44f3a-kube-api-access-bldbr\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx\" (UID: \"d22f3d2a-b6ff-4188-9778-e7108dd44f3a\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.687212 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d22f3d2a-b6ff-4188-9778-e7108dd44f3a-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx\" (UID: \"d22f3d2a-b6ff-4188-9778-e7108dd44f3a\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.687314 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d22f3d2a-b6ff-4188-9778-e7108dd44f3a-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx\" (UID: \"d22f3d2a-b6ff-4188-9778-e7108dd44f3a\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.687362 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/d22f3d2a-b6ff-4188-9778-e7108dd44f3a-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx\" (UID: \"d22f3d2a-b6ff-4188-9778-e7108dd44f3a\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.687405 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bldbr\" (UniqueName: \"kubernetes.io/projected/d22f3d2a-b6ff-4188-9778-e7108dd44f3a-kube-api-access-bldbr\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx\" (UID: \"d22f3d2a-b6ff-4188-9778-e7108dd44f3a\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.694313 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d22f3d2a-b6ff-4188-9778-e7108dd44f3a-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx\" (UID: \"d22f3d2a-b6ff-4188-9778-e7108dd44f3a\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.694987 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d22f3d2a-b6ff-4188-9778-e7108dd44f3a-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx\" (UID: \"d22f3d2a-b6ff-4188-9778-e7108dd44f3a\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.695389 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d22f3d2a-b6ff-4188-9778-e7108dd44f3a-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx\" (UID: \"d22f3d2a-b6ff-4188-9778-e7108dd44f3a\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 
09:40:04.704226 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bldbr\" (UniqueName: \"kubernetes.io/projected/d22f3d2a-b6ff-4188-9778-e7108dd44f3a-kube-api-access-bldbr\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx\" (UID: \"d22f3d2a-b6ff-4188-9778-e7108dd44f3a\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx" Nov 25 09:40:04 crc kubenswrapper[4565]: I1125 09:40:04.713961 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx" Nov 25 09:40:05 crc kubenswrapper[4565]: I1125 09:40:05.160750 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx"] Nov 25 09:40:05 crc kubenswrapper[4565]: I1125 09:40:05.329266 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx" event={"ID":"d22f3d2a-b6ff-4188-9778-e7108dd44f3a","Type":"ContainerStarted","Data":"3d01fce58718da472793172ff794cd10beaa3cae0091255fb2f62a01fabb731e"} Nov 25 09:40:07 crc kubenswrapper[4565]: I1125 09:40:07.352740 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx" event={"ID":"d22f3d2a-b6ff-4188-9778-e7108dd44f3a","Type":"ContainerStarted","Data":"eb12db5f64d051899700f16014e65528f344f985ad7ccc49f098f06cb41f47d1"} Nov 25 09:40:07 crc kubenswrapper[4565]: I1125 09:40:07.379122 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx" podStartSLOduration=1.600715477 podStartE2EDuration="3.379104514s" podCreationTimestamp="2025-11-25 09:40:04 +0000 UTC" firstStartedPulling="2025-11-25 09:40:05.166224578 +0000 UTC m=+2138.368719716" lastFinishedPulling="2025-11-25 09:40:06.944613615 +0000 UTC m=+2140.147108753" observedRunningTime="2025-11-25 
09:40:07.368394453 +0000 UTC m=+2140.570889581" watchObservedRunningTime="2025-11-25 09:40:07.379104514 +0000 UTC m=+2140.581599652" Nov 25 09:40:12 crc kubenswrapper[4565]: I1125 09:40:12.399713 4565 generic.go:334] "Generic (PLEG): container finished" podID="d22f3d2a-b6ff-4188-9778-e7108dd44f3a" containerID="eb12db5f64d051899700f16014e65528f344f985ad7ccc49f098f06cb41f47d1" exitCode=0 Nov 25 09:40:12 crc kubenswrapper[4565]: I1125 09:40:12.400386 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx" event={"ID":"d22f3d2a-b6ff-4188-9778-e7108dd44f3a","Type":"ContainerDied","Data":"eb12db5f64d051899700f16014e65528f344f985ad7ccc49f098f06cb41f47d1"} Nov 25 09:40:13 crc kubenswrapper[4565]: I1125 09:40:13.808774 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx" Nov 25 09:40:13 crc kubenswrapper[4565]: I1125 09:40:13.989989 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bldbr\" (UniqueName: \"kubernetes.io/projected/d22f3d2a-b6ff-4188-9778-e7108dd44f3a-kube-api-access-bldbr\") pod \"d22f3d2a-b6ff-4188-9778-e7108dd44f3a\" (UID: \"d22f3d2a-b6ff-4188-9778-e7108dd44f3a\") " Nov 25 09:40:13 crc kubenswrapper[4565]: I1125 09:40:13.990176 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d22f3d2a-b6ff-4188-9778-e7108dd44f3a-ceph\") pod \"d22f3d2a-b6ff-4188-9778-e7108dd44f3a\" (UID: \"d22f3d2a-b6ff-4188-9778-e7108dd44f3a\") " Nov 25 09:40:13 crc kubenswrapper[4565]: I1125 09:40:13.990203 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d22f3d2a-b6ff-4188-9778-e7108dd44f3a-ssh-key\") pod \"d22f3d2a-b6ff-4188-9778-e7108dd44f3a\" (UID: \"d22f3d2a-b6ff-4188-9778-e7108dd44f3a\") " Nov 25 09:40:13 crc 
kubenswrapper[4565]: I1125 09:40:13.990269 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d22f3d2a-b6ff-4188-9778-e7108dd44f3a-inventory\") pod \"d22f3d2a-b6ff-4188-9778-e7108dd44f3a\" (UID: \"d22f3d2a-b6ff-4188-9778-e7108dd44f3a\") " Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.033211 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22f3d2a-b6ff-4188-9778-e7108dd44f3a-ceph" (OuterVolumeSpecName: "ceph") pod "d22f3d2a-b6ff-4188-9778-e7108dd44f3a" (UID: "d22f3d2a-b6ff-4188-9778-e7108dd44f3a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.037082 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d22f3d2a-b6ff-4188-9778-e7108dd44f3a-kube-api-access-bldbr" (OuterVolumeSpecName: "kube-api-access-bldbr") pod "d22f3d2a-b6ff-4188-9778-e7108dd44f3a" (UID: "d22f3d2a-b6ff-4188-9778-e7108dd44f3a"). InnerVolumeSpecName "kube-api-access-bldbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.077412 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22f3d2a-b6ff-4188-9778-e7108dd44f3a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d22f3d2a-b6ff-4188-9778-e7108dd44f3a" (UID: "d22f3d2a-b6ff-4188-9778-e7108dd44f3a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.079605 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22f3d2a-b6ff-4188-9778-e7108dd44f3a-inventory" (OuterVolumeSpecName: "inventory") pod "d22f3d2a-b6ff-4188-9778-e7108dd44f3a" (UID: "d22f3d2a-b6ff-4188-9778-e7108dd44f3a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.096712 4565 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d22f3d2a-b6ff-4188-9778-e7108dd44f3a-ceph\") on node \"crc\" DevicePath \"\"" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.096735 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d22f3d2a-b6ff-4188-9778-e7108dd44f3a-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.096747 4565 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d22f3d2a-b6ff-4188-9778-e7108dd44f3a-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.096762 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bldbr\" (UniqueName: \"kubernetes.io/projected/d22f3d2a-b6ff-4188-9778-e7108dd44f3a-kube-api-access-bldbr\") on node \"crc\" DevicePath \"\"" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.420589 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx" event={"ID":"d22f3d2a-b6ff-4188-9778-e7108dd44f3a","Type":"ContainerDied","Data":"3d01fce58718da472793172ff794cd10beaa3cae0091255fb2f62a01fabb731e"} Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.420973 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d01fce58718da472793172ff794cd10beaa3cae0091255fb2f62a01fabb731e" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.420666 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.520839 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn"] Nov 25 09:40:14 crc kubenswrapper[4565]: E1125 09:40:14.521192 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d22f3d2a-b6ff-4188-9778-e7108dd44f3a" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.521212 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d22f3d2a-b6ff-4188-9778-e7108dd44f3a" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.521389 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="d22f3d2a-b6ff-4188-9778-e7108dd44f3a" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.521951 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.523720 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.524026 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.524348 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.526366 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.534459 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn"] Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.535693 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.536782 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.707534 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eedd2b64-c2c0-43dd-a5d9-ee7508387909-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2rrn\" (UID: \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.707602 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eedd2b64-c2c0-43dd-a5d9-ee7508387909-ceph\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-m2rrn\" (UID: \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.708341 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/eedd2b64-c2c0-43dd-a5d9-ee7508387909-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2rrn\" (UID: \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.708425 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thdzj\" (UniqueName: \"kubernetes.io/projected/eedd2b64-c2c0-43dd-a5d9-ee7508387909-kube-api-access-thdzj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2rrn\" (UID: \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.708869 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eedd2b64-c2c0-43dd-a5d9-ee7508387909-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2rrn\" (UID: \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.709255 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedd2b64-c2c0-43dd-a5d9-ee7508387909-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2rrn\" (UID: \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" Nov 25 09:40:14 crc 
kubenswrapper[4565]: I1125 09:40:14.811969 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/eedd2b64-c2c0-43dd-a5d9-ee7508387909-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2rrn\" (UID: \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.812700 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thdzj\" (UniqueName: \"kubernetes.io/projected/eedd2b64-c2c0-43dd-a5d9-ee7508387909-kube-api-access-thdzj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2rrn\" (UID: \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.812768 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/eedd2b64-c2c0-43dd-a5d9-ee7508387909-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2rrn\" (UID: \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.812986 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eedd2b64-c2c0-43dd-a5d9-ee7508387909-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2rrn\" (UID: \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.813068 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedd2b64-c2c0-43dd-a5d9-ee7508387909-ovn-combined-ca-bundle\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-m2rrn\" (UID: \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.813302 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eedd2b64-c2c0-43dd-a5d9-ee7508387909-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2rrn\" (UID: \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.813392 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eedd2b64-c2c0-43dd-a5d9-ee7508387909-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2rrn\" (UID: \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.817958 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eedd2b64-c2c0-43dd-a5d9-ee7508387909-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2rrn\" (UID: \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.818382 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedd2b64-c2c0-43dd-a5d9-ee7508387909-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2rrn\" (UID: \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.818701 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/eedd2b64-c2c0-43dd-a5d9-ee7508387909-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2rrn\" (UID: \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.821657 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eedd2b64-c2c0-43dd-a5d9-ee7508387909-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2rrn\" (UID: \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.828731 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thdzj\" (UniqueName: \"kubernetes.io/projected/eedd2b64-c2c0-43dd-a5d9-ee7508387909-kube-api-access-thdzj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2rrn\" (UID: \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" Nov 25 09:40:14 crc kubenswrapper[4565]: I1125 09:40:14.835412 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" Nov 25 09:40:15 crc kubenswrapper[4565]: I1125 09:40:15.354102 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn"] Nov 25 09:40:15 crc kubenswrapper[4565]: I1125 09:40:15.431982 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" event={"ID":"eedd2b64-c2c0-43dd-a5d9-ee7508387909","Type":"ContainerStarted","Data":"c01303c2c3c674172b8aa8f950e88a43d181b026eecee9307cfb2e29d22f697e"} Nov 25 09:40:16 crc kubenswrapper[4565]: I1125 09:40:16.442691 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" event={"ID":"eedd2b64-c2c0-43dd-a5d9-ee7508387909","Type":"ContainerStarted","Data":"713c5c67f7722e9ced950a44d6b8b93df218374fd67a63ed5299cf2d2eee60cd"} Nov 25 09:40:16 crc kubenswrapper[4565]: I1125 09:40:16.459779 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" podStartSLOduration=1.950944617 podStartE2EDuration="2.459745119s" podCreationTimestamp="2025-11-25 09:40:14 +0000 UTC" firstStartedPulling="2025-11-25 09:40:15.365388522 +0000 UTC m=+2148.567883660" lastFinishedPulling="2025-11-25 09:40:15.874189024 +0000 UTC m=+2149.076684162" observedRunningTime="2025-11-25 09:40:16.456226466 +0000 UTC m=+2149.658721604" watchObservedRunningTime="2025-11-25 09:40:16.459745119 +0000 UTC m=+2149.662240258" Nov 25 09:40:23 crc kubenswrapper[4565]: I1125 09:40:23.147729 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dvrz9"] Nov 25 09:40:23 crc kubenswrapper[4565]: I1125 09:40:23.149917 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dvrz9" Nov 25 09:40:23 crc kubenswrapper[4565]: I1125 09:40:23.165550 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dvrz9"] Nov 25 09:40:23 crc kubenswrapper[4565]: I1125 09:40:23.296459 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7316ee2-681b-4db6-ae80-1b4807963ba4-catalog-content\") pod \"redhat-operators-dvrz9\" (UID: \"d7316ee2-681b-4db6-ae80-1b4807963ba4\") " pod="openshift-marketplace/redhat-operators-dvrz9" Nov 25 09:40:23 crc kubenswrapper[4565]: I1125 09:40:23.296543 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7316ee2-681b-4db6-ae80-1b4807963ba4-utilities\") pod \"redhat-operators-dvrz9\" (UID: \"d7316ee2-681b-4db6-ae80-1b4807963ba4\") " pod="openshift-marketplace/redhat-operators-dvrz9" Nov 25 09:40:23 crc kubenswrapper[4565]: I1125 09:40:23.297691 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvqvf\" (UniqueName: \"kubernetes.io/projected/d7316ee2-681b-4db6-ae80-1b4807963ba4-kube-api-access-fvqvf\") pod \"redhat-operators-dvrz9\" (UID: \"d7316ee2-681b-4db6-ae80-1b4807963ba4\") " pod="openshift-marketplace/redhat-operators-dvrz9" Nov 25 09:40:23 crc kubenswrapper[4565]: I1125 09:40:23.399536 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7316ee2-681b-4db6-ae80-1b4807963ba4-utilities\") pod \"redhat-operators-dvrz9\" (UID: \"d7316ee2-681b-4db6-ae80-1b4807963ba4\") " pod="openshift-marketplace/redhat-operators-dvrz9" Nov 25 09:40:23 crc kubenswrapper[4565]: I1125 09:40:23.399851 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fvqvf\" (UniqueName: \"kubernetes.io/projected/d7316ee2-681b-4db6-ae80-1b4807963ba4-kube-api-access-fvqvf\") pod \"redhat-operators-dvrz9\" (UID: \"d7316ee2-681b-4db6-ae80-1b4807963ba4\") " pod="openshift-marketplace/redhat-operators-dvrz9" Nov 25 09:40:23 crc kubenswrapper[4565]: I1125 09:40:23.399968 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7316ee2-681b-4db6-ae80-1b4807963ba4-utilities\") pod \"redhat-operators-dvrz9\" (UID: \"d7316ee2-681b-4db6-ae80-1b4807963ba4\") " pod="openshift-marketplace/redhat-operators-dvrz9" Nov 25 09:40:23 crc kubenswrapper[4565]: I1125 09:40:23.400270 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7316ee2-681b-4db6-ae80-1b4807963ba4-catalog-content\") pod \"redhat-operators-dvrz9\" (UID: \"d7316ee2-681b-4db6-ae80-1b4807963ba4\") " pod="openshift-marketplace/redhat-operators-dvrz9" Nov 25 09:40:23 crc kubenswrapper[4565]: I1125 09:40:23.400511 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7316ee2-681b-4db6-ae80-1b4807963ba4-catalog-content\") pod \"redhat-operators-dvrz9\" (UID: \"d7316ee2-681b-4db6-ae80-1b4807963ba4\") " pod="openshift-marketplace/redhat-operators-dvrz9" Nov 25 09:40:23 crc kubenswrapper[4565]: I1125 09:40:23.427864 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvqvf\" (UniqueName: \"kubernetes.io/projected/d7316ee2-681b-4db6-ae80-1b4807963ba4-kube-api-access-fvqvf\") pod \"redhat-operators-dvrz9\" (UID: \"d7316ee2-681b-4db6-ae80-1b4807963ba4\") " pod="openshift-marketplace/redhat-operators-dvrz9" Nov 25 09:40:23 crc kubenswrapper[4565]: I1125 09:40:23.467738 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dvrz9" Nov 25 09:40:23 crc kubenswrapper[4565]: I1125 09:40:23.931431 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dvrz9"] Nov 25 09:40:24 crc kubenswrapper[4565]: I1125 09:40:24.510024 4565 generic.go:334] "Generic (PLEG): container finished" podID="d7316ee2-681b-4db6-ae80-1b4807963ba4" containerID="b69e3cddd5ee45fcf926dcd47c59bc17cbe5b12f6833f020c156a6500a1dd164" exitCode=0 Nov 25 09:40:24 crc kubenswrapper[4565]: I1125 09:40:24.510133 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvrz9" event={"ID":"d7316ee2-681b-4db6-ae80-1b4807963ba4","Type":"ContainerDied","Data":"b69e3cddd5ee45fcf926dcd47c59bc17cbe5b12f6833f020c156a6500a1dd164"} Nov 25 09:40:24 crc kubenswrapper[4565]: I1125 09:40:24.510525 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvrz9" event={"ID":"d7316ee2-681b-4db6-ae80-1b4807963ba4","Type":"ContainerStarted","Data":"2fccf48bfa600f9490b2d3e34e2777a827190a9dda6571bae74b0e083d24344e"} Nov 25 09:40:25 crc kubenswrapper[4565]: I1125 09:40:25.099275 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:40:25 crc kubenswrapper[4565]: I1125 09:40:25.099358 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:40:25 crc kubenswrapper[4565]: I1125 09:40:25.522234 4565 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-dvrz9" event={"ID":"d7316ee2-681b-4db6-ae80-1b4807963ba4","Type":"ContainerStarted","Data":"88f238088f0a112b7ccd9e473c4c0f1e63869797f4720739e520d7fa390177b9"} Nov 25 09:40:28 crc kubenswrapper[4565]: I1125 09:40:28.552576 4565 generic.go:334] "Generic (PLEG): container finished" podID="d7316ee2-681b-4db6-ae80-1b4807963ba4" containerID="88f238088f0a112b7ccd9e473c4c0f1e63869797f4720739e520d7fa390177b9" exitCode=0 Nov 25 09:40:28 crc kubenswrapper[4565]: I1125 09:40:28.552660 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvrz9" event={"ID":"d7316ee2-681b-4db6-ae80-1b4807963ba4","Type":"ContainerDied","Data":"88f238088f0a112b7ccd9e473c4c0f1e63869797f4720739e520d7fa390177b9"} Nov 25 09:40:29 crc kubenswrapper[4565]: I1125 09:40:29.565211 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvrz9" event={"ID":"d7316ee2-681b-4db6-ae80-1b4807963ba4","Type":"ContainerStarted","Data":"c6c21c9a6f54be635ac28bf6f339aeb465044e8aaf4eb5a7dab5ef8d90b2e484"} Nov 25 09:40:29 crc kubenswrapper[4565]: I1125 09:40:29.588221 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dvrz9" podStartSLOduration=2.09502268 podStartE2EDuration="6.588206303s" podCreationTimestamp="2025-11-25 09:40:23 +0000 UTC" firstStartedPulling="2025-11-25 09:40:24.51318801 +0000 UTC m=+2157.715683148" lastFinishedPulling="2025-11-25 09:40:29.006371633 +0000 UTC m=+2162.208866771" observedRunningTime="2025-11-25 09:40:29.583024654 +0000 UTC m=+2162.785519792" watchObservedRunningTime="2025-11-25 09:40:29.588206303 +0000 UTC m=+2162.790701442" Nov 25 09:40:33 crc kubenswrapper[4565]: I1125 09:40:33.468535 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dvrz9" Nov 25 09:40:33 crc kubenswrapper[4565]: I1125 09:40:33.469367 4565 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dvrz9" Nov 25 09:40:34 crc kubenswrapper[4565]: I1125 09:40:34.509003 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dvrz9" podUID="d7316ee2-681b-4db6-ae80-1b4807963ba4" containerName="registry-server" probeResult="failure" output=< Nov 25 09:40:34 crc kubenswrapper[4565]: timeout: failed to connect service ":50051" within 1s Nov 25 09:40:34 crc kubenswrapper[4565]: > Nov 25 09:40:43 crc kubenswrapper[4565]: I1125 09:40:43.509563 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dvrz9" Nov 25 09:40:43 crc kubenswrapper[4565]: I1125 09:40:43.564323 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dvrz9" Nov 25 09:40:43 crc kubenswrapper[4565]: I1125 09:40:43.756611 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dvrz9"] Nov 25 09:40:44 crc kubenswrapper[4565]: I1125 09:40:44.752331 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dvrz9" podUID="d7316ee2-681b-4db6-ae80-1b4807963ba4" containerName="registry-server" containerID="cri-o://c6c21c9a6f54be635ac28bf6f339aeb465044e8aaf4eb5a7dab5ef8d90b2e484" gracePeriod=2 Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.398472 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dvrz9" Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.486419 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7316ee2-681b-4db6-ae80-1b4807963ba4-utilities\") pod \"d7316ee2-681b-4db6-ae80-1b4807963ba4\" (UID: \"d7316ee2-681b-4db6-ae80-1b4807963ba4\") " Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.486474 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvqvf\" (UniqueName: \"kubernetes.io/projected/d7316ee2-681b-4db6-ae80-1b4807963ba4-kube-api-access-fvqvf\") pod \"d7316ee2-681b-4db6-ae80-1b4807963ba4\" (UID: \"d7316ee2-681b-4db6-ae80-1b4807963ba4\") " Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.486514 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7316ee2-681b-4db6-ae80-1b4807963ba4-catalog-content\") pod \"d7316ee2-681b-4db6-ae80-1b4807963ba4\" (UID: \"d7316ee2-681b-4db6-ae80-1b4807963ba4\") " Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.487252 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7316ee2-681b-4db6-ae80-1b4807963ba4-utilities" (OuterVolumeSpecName: "utilities") pod "d7316ee2-681b-4db6-ae80-1b4807963ba4" (UID: "d7316ee2-681b-4db6-ae80-1b4807963ba4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.487584 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7316ee2-681b-4db6-ae80-1b4807963ba4-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.496426 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7316ee2-681b-4db6-ae80-1b4807963ba4-kube-api-access-fvqvf" (OuterVolumeSpecName: "kube-api-access-fvqvf") pod "d7316ee2-681b-4db6-ae80-1b4807963ba4" (UID: "d7316ee2-681b-4db6-ae80-1b4807963ba4"). InnerVolumeSpecName "kube-api-access-fvqvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.573491 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7316ee2-681b-4db6-ae80-1b4807963ba4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7316ee2-681b-4db6-ae80-1b4807963ba4" (UID: "d7316ee2-681b-4db6-ae80-1b4807963ba4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.590429 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvqvf\" (UniqueName: \"kubernetes.io/projected/d7316ee2-681b-4db6-ae80-1b4807963ba4-kube-api-access-fvqvf\") on node \"crc\" DevicePath \"\"" Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.590550 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7316ee2-681b-4db6-ae80-1b4807963ba4-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.762282 4565 generic.go:334] "Generic (PLEG): container finished" podID="d7316ee2-681b-4db6-ae80-1b4807963ba4" containerID="c6c21c9a6f54be635ac28bf6f339aeb465044e8aaf4eb5a7dab5ef8d90b2e484" exitCode=0 Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.762336 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvrz9" event={"ID":"d7316ee2-681b-4db6-ae80-1b4807963ba4","Type":"ContainerDied","Data":"c6c21c9a6f54be635ac28bf6f339aeb465044e8aaf4eb5a7dab5ef8d90b2e484"} Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.762367 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvrz9" event={"ID":"d7316ee2-681b-4db6-ae80-1b4807963ba4","Type":"ContainerDied","Data":"2fccf48bfa600f9490b2d3e34e2777a827190a9dda6571bae74b0e083d24344e"} Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.762386 4565 scope.go:117] "RemoveContainer" containerID="c6c21c9a6f54be635ac28bf6f339aeb465044e8aaf4eb5a7dab5ef8d90b2e484" Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.762518 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dvrz9" Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.785959 4565 scope.go:117] "RemoveContainer" containerID="88f238088f0a112b7ccd9e473c4c0f1e63869797f4720739e520d7fa390177b9" Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.795965 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dvrz9"] Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.801585 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dvrz9"] Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.827547 4565 scope.go:117] "RemoveContainer" containerID="b69e3cddd5ee45fcf926dcd47c59bc17cbe5b12f6833f020c156a6500a1dd164" Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.863082 4565 scope.go:117] "RemoveContainer" containerID="c6c21c9a6f54be635ac28bf6f339aeb465044e8aaf4eb5a7dab5ef8d90b2e484" Nov 25 09:40:45 crc kubenswrapper[4565]: E1125 09:40:45.863464 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6c21c9a6f54be635ac28bf6f339aeb465044e8aaf4eb5a7dab5ef8d90b2e484\": container with ID starting with c6c21c9a6f54be635ac28bf6f339aeb465044e8aaf4eb5a7dab5ef8d90b2e484 not found: ID does not exist" containerID="c6c21c9a6f54be635ac28bf6f339aeb465044e8aaf4eb5a7dab5ef8d90b2e484" Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.863510 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6c21c9a6f54be635ac28bf6f339aeb465044e8aaf4eb5a7dab5ef8d90b2e484"} err="failed to get container status \"c6c21c9a6f54be635ac28bf6f339aeb465044e8aaf4eb5a7dab5ef8d90b2e484\": rpc error: code = NotFound desc = could not find container \"c6c21c9a6f54be635ac28bf6f339aeb465044e8aaf4eb5a7dab5ef8d90b2e484\": container with ID starting with c6c21c9a6f54be635ac28bf6f339aeb465044e8aaf4eb5a7dab5ef8d90b2e484 not found: ID does 
not exist" Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.863541 4565 scope.go:117] "RemoveContainer" containerID="88f238088f0a112b7ccd9e473c4c0f1e63869797f4720739e520d7fa390177b9" Nov 25 09:40:45 crc kubenswrapper[4565]: E1125 09:40:45.863842 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88f238088f0a112b7ccd9e473c4c0f1e63869797f4720739e520d7fa390177b9\": container with ID starting with 88f238088f0a112b7ccd9e473c4c0f1e63869797f4720739e520d7fa390177b9 not found: ID does not exist" containerID="88f238088f0a112b7ccd9e473c4c0f1e63869797f4720739e520d7fa390177b9" Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.863865 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88f238088f0a112b7ccd9e473c4c0f1e63869797f4720739e520d7fa390177b9"} err="failed to get container status \"88f238088f0a112b7ccd9e473c4c0f1e63869797f4720739e520d7fa390177b9\": rpc error: code = NotFound desc = could not find container \"88f238088f0a112b7ccd9e473c4c0f1e63869797f4720739e520d7fa390177b9\": container with ID starting with 88f238088f0a112b7ccd9e473c4c0f1e63869797f4720739e520d7fa390177b9 not found: ID does not exist" Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.863882 4565 scope.go:117] "RemoveContainer" containerID="b69e3cddd5ee45fcf926dcd47c59bc17cbe5b12f6833f020c156a6500a1dd164" Nov 25 09:40:45 crc kubenswrapper[4565]: E1125 09:40:45.864332 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b69e3cddd5ee45fcf926dcd47c59bc17cbe5b12f6833f020c156a6500a1dd164\": container with ID starting with b69e3cddd5ee45fcf926dcd47c59bc17cbe5b12f6833f020c156a6500a1dd164 not found: ID does not exist" containerID="b69e3cddd5ee45fcf926dcd47c59bc17cbe5b12f6833f020c156a6500a1dd164" Nov 25 09:40:45 crc kubenswrapper[4565]: I1125 09:40:45.864355 4565 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b69e3cddd5ee45fcf926dcd47c59bc17cbe5b12f6833f020c156a6500a1dd164"} err="failed to get container status \"b69e3cddd5ee45fcf926dcd47c59bc17cbe5b12f6833f020c156a6500a1dd164\": rpc error: code = NotFound desc = could not find container \"b69e3cddd5ee45fcf926dcd47c59bc17cbe5b12f6833f020c156a6500a1dd164\": container with ID starting with b69e3cddd5ee45fcf926dcd47c59bc17cbe5b12f6833f020c156a6500a1dd164 not found: ID does not exist" Nov 25 09:40:47 crc kubenswrapper[4565]: I1125 09:40:47.108045 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7316ee2-681b-4db6-ae80-1b4807963ba4" path="/var/lib/kubelet/pods/d7316ee2-681b-4db6-ae80-1b4807963ba4/volumes" Nov 25 09:40:55 crc kubenswrapper[4565]: I1125 09:40:55.099078 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:40:55 crc kubenswrapper[4565]: I1125 09:40:55.099376 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:40:55 crc kubenswrapper[4565]: I1125 09:40:55.105302 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 09:40:55 crc kubenswrapper[4565]: I1125 09:40:55.105811 4565 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843"} 
pod="openshift-machine-config-operator/machine-config-daemon-r28bt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 09:40:55 crc kubenswrapper[4565]: I1125 09:40:55.105878 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" containerID="cri-o://59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" gracePeriod=600 Nov 25 09:40:55 crc kubenswrapper[4565]: E1125 09:40:55.225030 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:40:55 crc kubenswrapper[4565]: I1125 09:40:55.849317 4565 generic.go:334] "Generic (PLEG): container finished" podID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" exitCode=0 Nov 25 09:40:55 crc kubenswrapper[4565]: I1125 09:40:55.849382 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerDied","Data":"59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843"} Nov 25 09:40:55 crc kubenswrapper[4565]: I1125 09:40:55.849732 4565 scope.go:117] "RemoveContainer" containerID="538bdafdafa57c950bb08b4eaa710350c74e791fe340685c16a050dcd97f3d53" Nov 25 09:40:55 crc kubenswrapper[4565]: I1125 09:40:55.851132 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 
25 09:40:55 crc kubenswrapper[4565]: E1125 09:40:55.851372 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:41:07 crc kubenswrapper[4565]: I1125 09:41:07.103114 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:41:07 crc kubenswrapper[4565]: E1125 09:41:07.104322 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:41:13 crc kubenswrapper[4565]: I1125 09:41:13.023494 4565 generic.go:334] "Generic (PLEG): container finished" podID="eedd2b64-c2c0-43dd-a5d9-ee7508387909" containerID="713c5c67f7722e9ced950a44d6b8b93df218374fd67a63ed5299cf2d2eee60cd" exitCode=0 Nov 25 09:41:13 crc kubenswrapper[4565]: I1125 09:41:13.023696 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" event={"ID":"eedd2b64-c2c0-43dd-a5d9-ee7508387909","Type":"ContainerDied","Data":"713c5c67f7722e9ced950a44d6b8b93df218374fd67a63ed5299cf2d2eee60cd"} Nov 25 09:41:14 crc kubenswrapper[4565]: I1125 09:41:14.453813 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" Nov 25 09:41:14 crc kubenswrapper[4565]: I1125 09:41:14.496199 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eedd2b64-c2c0-43dd-a5d9-ee7508387909-inventory\") pod \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\" (UID: \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\") " Nov 25 09:41:14 crc kubenswrapper[4565]: I1125 09:41:14.496255 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thdzj\" (UniqueName: \"kubernetes.io/projected/eedd2b64-c2c0-43dd-a5d9-ee7508387909-kube-api-access-thdzj\") pod \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\" (UID: \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\") " Nov 25 09:41:14 crc kubenswrapper[4565]: I1125 09:41:14.496307 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/eedd2b64-c2c0-43dd-a5d9-ee7508387909-ovncontroller-config-0\") pod \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\" (UID: \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\") " Nov 25 09:41:14 crc kubenswrapper[4565]: I1125 09:41:14.496344 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eedd2b64-c2c0-43dd-a5d9-ee7508387909-ssh-key\") pod \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\" (UID: \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\") " Nov 25 09:41:14 crc kubenswrapper[4565]: I1125 09:41:14.496426 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedd2b64-c2c0-43dd-a5d9-ee7508387909-ovn-combined-ca-bundle\") pod \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\" (UID: \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\") " Nov 25 09:41:14 crc kubenswrapper[4565]: I1125 09:41:14.496452 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eedd2b64-c2c0-43dd-a5d9-ee7508387909-ceph\") pod \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\" (UID: \"eedd2b64-c2c0-43dd-a5d9-ee7508387909\") " Nov 25 09:41:14 crc kubenswrapper[4565]: I1125 09:41:14.517845 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedd2b64-c2c0-43dd-a5d9-ee7508387909-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "eedd2b64-c2c0-43dd-a5d9-ee7508387909" (UID: "eedd2b64-c2c0-43dd-a5d9-ee7508387909"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:41:14 crc kubenswrapper[4565]: I1125 09:41:14.518203 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedd2b64-c2c0-43dd-a5d9-ee7508387909-ceph" (OuterVolumeSpecName: "ceph") pod "eedd2b64-c2c0-43dd-a5d9-ee7508387909" (UID: "eedd2b64-c2c0-43dd-a5d9-ee7508387909"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:41:14 crc kubenswrapper[4565]: I1125 09:41:14.518912 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eedd2b64-c2c0-43dd-a5d9-ee7508387909-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "eedd2b64-c2c0-43dd-a5d9-ee7508387909" (UID: "eedd2b64-c2c0-43dd-a5d9-ee7508387909"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:41:14 crc kubenswrapper[4565]: I1125 09:41:14.520904 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedd2b64-c2c0-43dd-a5d9-ee7508387909-inventory" (OuterVolumeSpecName: "inventory") pod "eedd2b64-c2c0-43dd-a5d9-ee7508387909" (UID: "eedd2b64-c2c0-43dd-a5d9-ee7508387909"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:41:14 crc kubenswrapper[4565]: I1125 09:41:14.522194 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eedd2b64-c2c0-43dd-a5d9-ee7508387909-kube-api-access-thdzj" (OuterVolumeSpecName: "kube-api-access-thdzj") pod "eedd2b64-c2c0-43dd-a5d9-ee7508387909" (UID: "eedd2b64-c2c0-43dd-a5d9-ee7508387909"). InnerVolumeSpecName "kube-api-access-thdzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:41:14 crc kubenswrapper[4565]: I1125 09:41:14.535221 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedd2b64-c2c0-43dd-a5d9-ee7508387909-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "eedd2b64-c2c0-43dd-a5d9-ee7508387909" (UID: "eedd2b64-c2c0-43dd-a5d9-ee7508387909"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:41:14 crc kubenswrapper[4565]: I1125 09:41:14.600110 4565 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedd2b64-c2c0-43dd-a5d9-ee7508387909-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:41:14 crc kubenswrapper[4565]: I1125 09:41:14.600140 4565 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eedd2b64-c2c0-43dd-a5d9-ee7508387909-ceph\") on node \"crc\" DevicePath \"\"" Nov 25 09:41:14 crc kubenswrapper[4565]: I1125 09:41:14.600152 4565 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eedd2b64-c2c0-43dd-a5d9-ee7508387909-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 09:41:14 crc kubenswrapper[4565]: I1125 09:41:14.600168 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thdzj\" (UniqueName: \"kubernetes.io/projected/eedd2b64-c2c0-43dd-a5d9-ee7508387909-kube-api-access-thdzj\") on node \"crc\" DevicePath \"\"" Nov 
25 09:41:14 crc kubenswrapper[4565]: I1125 09:41:14.600182 4565 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/eedd2b64-c2c0-43dd-a5d9-ee7508387909-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Nov 25 09:41:14 crc kubenswrapper[4565]: I1125 09:41:14.600191 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eedd2b64-c2c0-43dd-a5d9-ee7508387909-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.045065 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" event={"ID":"eedd2b64-c2c0-43dd-a5d9-ee7508387909","Type":"ContainerDied","Data":"c01303c2c3c674172b8aa8f950e88a43d181b026eecee9307cfb2e29d22f697e"} Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.045448 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c01303c2c3c674172b8aa8f950e88a43d181b026eecee9307cfb2e29d22f697e" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.045167 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2rrn" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.172903 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw"] Nov 25 09:41:15 crc kubenswrapper[4565]: E1125 09:41:15.174595 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eedd2b64-c2c0-43dd-a5d9-ee7508387909" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.174634 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="eedd2b64-c2c0-43dd-a5d9-ee7508387909" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 25 09:41:15 crc kubenswrapper[4565]: E1125 09:41:15.174676 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7316ee2-681b-4db6-ae80-1b4807963ba4" containerName="extract-content" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.174684 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7316ee2-681b-4db6-ae80-1b4807963ba4" containerName="extract-content" Nov 25 09:41:15 crc kubenswrapper[4565]: E1125 09:41:15.174743 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7316ee2-681b-4db6-ae80-1b4807963ba4" containerName="extract-utilities" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.174772 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7316ee2-681b-4db6-ae80-1b4807963ba4" containerName="extract-utilities" Nov 25 09:41:15 crc kubenswrapper[4565]: E1125 09:41:15.174820 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7316ee2-681b-4db6-ae80-1b4807963ba4" containerName="registry-server" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.174829 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7316ee2-681b-4db6-ae80-1b4807963ba4" containerName="registry-server" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.175441 4565 
memory_manager.go:354] "RemoveStaleState removing state" podUID="eedd2b64-c2c0-43dd-a5d9-ee7508387909" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.175479 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7316ee2-681b-4db6-ae80-1b4807963ba4" containerName="registry-server" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.177816 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.182005 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.182284 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.185261 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.188238 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.188544 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.188720 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.191378 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.203493 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw"] Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.212274 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.212378 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.212429 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.212482 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.212573 
4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcsf7\" (UniqueName: \"kubernetes.io/projected/f5a1f544-cbde-40e4-aec7-72347718b75d-kube-api-access-hcsf7\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.212673 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.212746 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.314737 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 
09:41:15.314877 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.314915 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.314967 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.314991 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.315033 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcsf7\" (UniqueName: 
\"kubernetes.io/projected/f5a1f544-cbde-40e4-aec7-72347718b75d-kube-api-access-hcsf7\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.315087 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.320479 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.320601 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.320599 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw\" (UID: 
\"f5a1f544-cbde-40e4-aec7-72347718b75d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.320714 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.322396 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.322845 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:41:15 crc kubenswrapper[4565]: I1125 09:41:15.331501 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcsf7\" (UniqueName: \"kubernetes.io/projected/f5a1f544-cbde-40e4-aec7-72347718b75d-kube-api-access-hcsf7\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:41:15 crc 
kubenswrapper[4565]: I1125 09:41:15.511179 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:41:16 crc kubenswrapper[4565]: I1125 09:41:16.019691 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw"] Nov 25 09:41:16 crc kubenswrapper[4565]: W1125 09:41:16.024616 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5a1f544_cbde_40e4_aec7_72347718b75d.slice/crio-c86dd03831ec186c6868618c76ff8cbc1274af2dd8fe77ca7ae2011391ea4021 WatchSource:0}: Error finding container c86dd03831ec186c6868618c76ff8cbc1274af2dd8fe77ca7ae2011391ea4021: Status 404 returned error can't find the container with id c86dd03831ec186c6868618c76ff8cbc1274af2dd8fe77ca7ae2011391ea4021 Nov 25 09:41:16 crc kubenswrapper[4565]: I1125 09:41:16.055113 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" event={"ID":"f5a1f544-cbde-40e4-aec7-72347718b75d","Type":"ContainerStarted","Data":"c86dd03831ec186c6868618c76ff8cbc1274af2dd8fe77ca7ae2011391ea4021"} Nov 25 09:41:17 crc kubenswrapper[4565]: I1125 09:41:17.065524 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" event={"ID":"f5a1f544-cbde-40e4-aec7-72347718b75d","Type":"ContainerStarted","Data":"7a5d0e7e0299a8bbeb5bc13f2d8a9b379745df951ac0240597c40c7a6ad3daa8"} Nov 25 09:41:17 crc kubenswrapper[4565]: I1125 09:41:17.087068 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" podStartSLOduration=1.325681555 podStartE2EDuration="2.087048741s" podCreationTimestamp="2025-11-25 09:41:15 +0000 UTC" firstStartedPulling="2025-11-25 
09:41:16.028384923 +0000 UTC m=+2209.230880061" lastFinishedPulling="2025-11-25 09:41:16.789752109 +0000 UTC m=+2209.992247247" observedRunningTime="2025-11-25 09:41:17.079639131 +0000 UTC m=+2210.282134279" watchObservedRunningTime="2025-11-25 09:41:17.087048741 +0000 UTC m=+2210.289543878" Nov 25 09:41:19 crc kubenswrapper[4565]: I1125 09:41:19.098091 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:41:19 crc kubenswrapper[4565]: E1125 09:41:19.098879 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:41:30 crc kubenswrapper[4565]: I1125 09:41:30.097960 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:41:30 crc kubenswrapper[4565]: E1125 09:41:30.098900 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:41:41 crc kubenswrapper[4565]: I1125 09:41:41.098257 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:41:41 crc kubenswrapper[4565]: E1125 09:41:41.099255 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:41:56 crc kubenswrapper[4565]: I1125 09:41:56.097647 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:41:56 crc kubenswrapper[4565]: E1125 09:41:56.098624 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:41:57 crc kubenswrapper[4565]: I1125 09:41:57.699791 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jldqn"] Nov 25 09:41:57 crc kubenswrapper[4565]: I1125 09:41:57.702686 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jldqn" Nov 25 09:41:57 crc kubenswrapper[4565]: I1125 09:41:57.728801 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a35f9c7-0f29-4ab3-98ef-463d88be4a17-catalog-content\") pod \"certified-operators-jldqn\" (UID: \"8a35f9c7-0f29-4ab3-98ef-463d88be4a17\") " pod="openshift-marketplace/certified-operators-jldqn" Nov 25 09:41:57 crc kubenswrapper[4565]: I1125 09:41:57.729152 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a35f9c7-0f29-4ab3-98ef-463d88be4a17-utilities\") pod \"certified-operators-jldqn\" (UID: \"8a35f9c7-0f29-4ab3-98ef-463d88be4a17\") " pod="openshift-marketplace/certified-operators-jldqn" Nov 25 09:41:57 crc kubenswrapper[4565]: I1125 09:41:57.729381 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qh7s\" (UniqueName: \"kubernetes.io/projected/8a35f9c7-0f29-4ab3-98ef-463d88be4a17-kube-api-access-5qh7s\") pod \"certified-operators-jldqn\" (UID: \"8a35f9c7-0f29-4ab3-98ef-463d88be4a17\") " pod="openshift-marketplace/certified-operators-jldqn" Nov 25 09:41:57 crc kubenswrapper[4565]: I1125 09:41:57.756267 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jldqn"] Nov 25 09:41:57 crc kubenswrapper[4565]: I1125 09:41:57.831594 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qh7s\" (UniqueName: \"kubernetes.io/projected/8a35f9c7-0f29-4ab3-98ef-463d88be4a17-kube-api-access-5qh7s\") pod \"certified-operators-jldqn\" (UID: \"8a35f9c7-0f29-4ab3-98ef-463d88be4a17\") " pod="openshift-marketplace/certified-operators-jldqn" Nov 25 09:41:57 crc kubenswrapper[4565]: I1125 09:41:57.831656 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a35f9c7-0f29-4ab3-98ef-463d88be4a17-catalog-content\") pod \"certified-operators-jldqn\" (UID: \"8a35f9c7-0f29-4ab3-98ef-463d88be4a17\") " pod="openshift-marketplace/certified-operators-jldqn" Nov 25 09:41:57 crc kubenswrapper[4565]: I1125 09:41:57.831867 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a35f9c7-0f29-4ab3-98ef-463d88be4a17-utilities\") pod \"certified-operators-jldqn\" (UID: \"8a35f9c7-0f29-4ab3-98ef-463d88be4a17\") " pod="openshift-marketplace/certified-operators-jldqn" Nov 25 09:41:57 crc kubenswrapper[4565]: I1125 09:41:57.832432 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a35f9c7-0f29-4ab3-98ef-463d88be4a17-catalog-content\") pod \"certified-operators-jldqn\" (UID: \"8a35f9c7-0f29-4ab3-98ef-463d88be4a17\") " pod="openshift-marketplace/certified-operators-jldqn" Nov 25 09:41:57 crc kubenswrapper[4565]: I1125 09:41:57.832498 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a35f9c7-0f29-4ab3-98ef-463d88be4a17-utilities\") pod \"certified-operators-jldqn\" (UID: \"8a35f9c7-0f29-4ab3-98ef-463d88be4a17\") " pod="openshift-marketplace/certified-operators-jldqn" Nov 25 09:41:57 crc kubenswrapper[4565]: I1125 09:41:57.850132 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qh7s\" (UniqueName: \"kubernetes.io/projected/8a35f9c7-0f29-4ab3-98ef-463d88be4a17-kube-api-access-5qh7s\") pod \"certified-operators-jldqn\" (UID: \"8a35f9c7-0f29-4ab3-98ef-463d88be4a17\") " pod="openshift-marketplace/certified-operators-jldqn" Nov 25 09:41:58 crc kubenswrapper[4565]: I1125 09:41:58.028003 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jldqn" Nov 25 09:41:58 crc kubenswrapper[4565]: I1125 09:41:58.527966 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jldqn"] Nov 25 09:41:59 crc kubenswrapper[4565]: I1125 09:41:59.502895 4565 generic.go:334] "Generic (PLEG): container finished" podID="8a35f9c7-0f29-4ab3-98ef-463d88be4a17" containerID="50e56704785ce578620c05cdf92c7d0fcd0432d2ecba43505e74bc12571fee47" exitCode=0 Nov 25 09:41:59 crc kubenswrapper[4565]: I1125 09:41:59.502964 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jldqn" event={"ID":"8a35f9c7-0f29-4ab3-98ef-463d88be4a17","Type":"ContainerDied","Data":"50e56704785ce578620c05cdf92c7d0fcd0432d2ecba43505e74bc12571fee47"} Nov 25 09:41:59 crc kubenswrapper[4565]: I1125 09:41:59.503200 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jldqn" event={"ID":"8a35f9c7-0f29-4ab3-98ef-463d88be4a17","Type":"ContainerStarted","Data":"86293a2b941ff2a6524c31b5ddb21f215354707922a542e7211a660f72002ce1"} Nov 25 09:42:00 crc kubenswrapper[4565]: I1125 09:42:00.516699 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jldqn" event={"ID":"8a35f9c7-0f29-4ab3-98ef-463d88be4a17","Type":"ContainerStarted","Data":"ff5ad8222e531998647b8e603204ccaa22a1fdfd7cd8f858f82e4b9971d5d588"} Nov 25 09:42:01 crc kubenswrapper[4565]: I1125 09:42:01.528608 4565 generic.go:334] "Generic (PLEG): container finished" podID="8a35f9c7-0f29-4ab3-98ef-463d88be4a17" containerID="ff5ad8222e531998647b8e603204ccaa22a1fdfd7cd8f858f82e4b9971d5d588" exitCode=0 Nov 25 09:42:01 crc kubenswrapper[4565]: I1125 09:42:01.528743 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jldqn" 
event={"ID":"8a35f9c7-0f29-4ab3-98ef-463d88be4a17","Type":"ContainerDied","Data":"ff5ad8222e531998647b8e603204ccaa22a1fdfd7cd8f858f82e4b9971d5d588"} Nov 25 09:42:02 crc kubenswrapper[4565]: I1125 09:42:02.541053 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jldqn" event={"ID":"8a35f9c7-0f29-4ab3-98ef-463d88be4a17","Type":"ContainerStarted","Data":"da0497c788d6fb38aabf7feb9e2d207dcdb5114f98a682301a84f2c59fd99c85"} Nov 25 09:42:02 crc kubenswrapper[4565]: I1125 09:42:02.563297 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jldqn" podStartSLOduration=3.085838865 podStartE2EDuration="5.563267545s" podCreationTimestamp="2025-11-25 09:41:57 +0000 UTC" firstStartedPulling="2025-11-25 09:41:59.504642229 +0000 UTC m=+2252.707137367" lastFinishedPulling="2025-11-25 09:42:01.982070908 +0000 UTC m=+2255.184566047" observedRunningTime="2025-11-25 09:42:02.55884614 +0000 UTC m=+2255.761341278" watchObservedRunningTime="2025-11-25 09:42:02.563267545 +0000 UTC m=+2255.765762683" Nov 25 09:42:04 crc kubenswrapper[4565]: I1125 09:42:04.568527 4565 generic.go:334] "Generic (PLEG): container finished" podID="f5a1f544-cbde-40e4-aec7-72347718b75d" containerID="7a5d0e7e0299a8bbeb5bc13f2d8a9b379745df951ac0240597c40c7a6ad3daa8" exitCode=0 Nov 25 09:42:04 crc kubenswrapper[4565]: I1125 09:42:04.568601 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" event={"ID":"f5a1f544-cbde-40e4-aec7-72347718b75d","Type":"ContainerDied","Data":"7a5d0e7e0299a8bbeb5bc13f2d8a9b379745df951ac0240597c40c7a6ad3daa8"} Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.007211 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.204455 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-nova-metadata-neutron-config-0\") pod \"f5a1f544-cbde-40e4-aec7-72347718b75d\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.204553 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-neutron-metadata-combined-ca-bundle\") pod \"f5a1f544-cbde-40e4-aec7-72347718b75d\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.204586 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcsf7\" (UniqueName: \"kubernetes.io/projected/f5a1f544-cbde-40e4-aec7-72347718b75d-kube-api-access-hcsf7\") pod \"f5a1f544-cbde-40e4-aec7-72347718b75d\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.204634 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-ssh-key\") pod \"f5a1f544-cbde-40e4-aec7-72347718b75d\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.204657 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-inventory\") pod \"f5a1f544-cbde-40e4-aec7-72347718b75d\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.204678 4565 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-ceph\") pod \"f5a1f544-cbde-40e4-aec7-72347718b75d\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.204724 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-neutron-ovn-metadata-agent-neutron-config-0\") pod \"f5a1f544-cbde-40e4-aec7-72347718b75d\" (UID: \"f5a1f544-cbde-40e4-aec7-72347718b75d\") " Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.214117 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f5a1f544-cbde-40e4-aec7-72347718b75d" (UID: "f5a1f544-cbde-40e4-aec7-72347718b75d"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.220079 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-ceph" (OuterVolumeSpecName: "ceph") pod "f5a1f544-cbde-40e4-aec7-72347718b75d" (UID: "f5a1f544-cbde-40e4-aec7-72347718b75d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.222121 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5a1f544-cbde-40e4-aec7-72347718b75d-kube-api-access-hcsf7" (OuterVolumeSpecName: "kube-api-access-hcsf7") pod "f5a1f544-cbde-40e4-aec7-72347718b75d" (UID: "f5a1f544-cbde-40e4-aec7-72347718b75d"). InnerVolumeSpecName "kube-api-access-hcsf7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.232410 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-inventory" (OuterVolumeSpecName: "inventory") pod "f5a1f544-cbde-40e4-aec7-72347718b75d" (UID: "f5a1f544-cbde-40e4-aec7-72347718b75d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.232794 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f5a1f544-cbde-40e4-aec7-72347718b75d" (UID: "f5a1f544-cbde-40e4-aec7-72347718b75d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.234306 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "f5a1f544-cbde-40e4-aec7-72347718b75d" (UID: "f5a1f544-cbde-40e4-aec7-72347718b75d"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.234722 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "f5a1f544-cbde-40e4-aec7-72347718b75d" (UID: "f5a1f544-cbde-40e4-aec7-72347718b75d"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.307698 4565 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.307729 4565 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.307743 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcsf7\" (UniqueName: \"kubernetes.io/projected/f5a1f544-cbde-40e4-aec7-72347718b75d-kube-api-access-hcsf7\") on node \"crc\" DevicePath \"\"" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.307758 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.307766 4565 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.307776 4565 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-ceph\") on node \"crc\" DevicePath \"\"" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.307785 4565 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5a1f544-cbde-40e4-aec7-72347718b75d-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" 
DevicePath \"\"" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.601404 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" event={"ID":"f5a1f544-cbde-40e4-aec7-72347718b75d","Type":"ContainerDied","Data":"c86dd03831ec186c6868618c76ff8cbc1274af2dd8fe77ca7ae2011391ea4021"} Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.601472 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c86dd03831ec186c6868618c76ff8cbc1274af2dd8fe77ca7ae2011391ea4021" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.601587 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.681031 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99"] Nov 25 09:42:06 crc kubenswrapper[4565]: E1125 09:42:06.681393 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5a1f544-cbde-40e4-aec7-72347718b75d" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.681412 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5a1f544-cbde-40e4-aec7-72347718b75d" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.681624 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5a1f544-cbde-40e4-aec7-72347718b75d" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.682205 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.683883 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.684153 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.684305 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 09:42:06 crc kubenswrapper[4565]: W1125 09:42:06.684513 4565 reflector.go:561] object-"openstack"/"libvirt-secret": failed to list *v1.Secret: secrets "libvirt-secret" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Nov 25 09:42:06 crc kubenswrapper[4565]: E1125 09:42:06.684540 4565 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"libvirt-secret\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"libvirt-secret\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.684984 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.685206 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.701032 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99"] Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.818093 
4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xqn99\" (UID: \"e1061e52-8553-4932-9689-83016e2b413f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.818329 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xqn99\" (UID: \"e1061e52-8553-4932-9689-83016e2b413f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.818476 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd5tz\" (UniqueName: \"kubernetes.io/projected/e1061e52-8553-4932-9689-83016e2b413f-kube-api-access-vd5tz\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xqn99\" (UID: \"e1061e52-8553-4932-9689-83016e2b413f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.818554 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xqn99\" (UID: \"e1061e52-8553-4932-9689-83016e2b413f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.818588 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xqn99\" (UID: \"e1061e52-8553-4932-9689-83016e2b413f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.818619 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xqn99\" (UID: \"e1061e52-8553-4932-9689-83016e2b413f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.920980 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xqn99\" (UID: \"e1061e52-8553-4932-9689-83016e2b413f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.921040 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xqn99\" (UID: \"e1061e52-8553-4932-9689-83016e2b413f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.921070 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xqn99\" (UID: \"e1061e52-8553-4932-9689-83016e2b413f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.921138 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xqn99\" (UID: \"e1061e52-8553-4932-9689-83016e2b413f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.921191 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xqn99\" (UID: \"e1061e52-8553-4932-9689-83016e2b413f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.921264 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd5tz\" (UniqueName: \"kubernetes.io/projected/e1061e52-8553-4932-9689-83016e2b413f-kube-api-access-vd5tz\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xqn99\" (UID: \"e1061e52-8553-4932-9689-83016e2b413f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.927737 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xqn99\" (UID: \"e1061e52-8553-4932-9689-83016e2b413f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.928176 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xqn99\" 
(UID: \"e1061e52-8553-4932-9689-83016e2b413f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.928370 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xqn99\" (UID: \"e1061e52-8553-4932-9689-83016e2b413f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.928698 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xqn99\" (UID: \"e1061e52-8553-4932-9689-83016e2b413f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" Nov 25 09:42:06 crc kubenswrapper[4565]: I1125 09:42:06.937547 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd5tz\" (UniqueName: \"kubernetes.io/projected/e1061e52-8553-4932-9689-83016e2b413f-kube-api-access-vd5tz\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xqn99\" (UID: \"e1061e52-8553-4932-9689-83016e2b413f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" Nov 25 09:42:07 crc kubenswrapper[4565]: I1125 09:42:07.882480 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Nov 25 09:42:07 crc kubenswrapper[4565]: I1125 09:42:07.895563 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xqn99\" (UID: \"e1061e52-8553-4932-9689-83016e2b413f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" Nov 25 09:42:07 crc kubenswrapper[4565]: I1125 
09:42:07.905029 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" Nov 25 09:42:08 crc kubenswrapper[4565]: I1125 09:42:08.028306 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jldqn" Nov 25 09:42:08 crc kubenswrapper[4565]: I1125 09:42:08.030047 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jldqn" Nov 25 09:42:08 crc kubenswrapper[4565]: I1125 09:42:08.079778 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jldqn" Nov 25 09:42:08 crc kubenswrapper[4565]: I1125 09:42:08.442381 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99"] Nov 25 09:42:08 crc kubenswrapper[4565]: I1125 09:42:08.622210 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" event={"ID":"e1061e52-8553-4932-9689-83016e2b413f","Type":"ContainerStarted","Data":"52e5a866e74c6a06b962511c4eea56dbab35a8c30063d868b7bafb24278a4c32"} Nov 25 09:42:08 crc kubenswrapper[4565]: I1125 09:42:08.665811 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jldqn" Nov 25 09:42:08 crc kubenswrapper[4565]: I1125 09:42:08.723657 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jldqn"] Nov 25 09:42:09 crc kubenswrapper[4565]: I1125 09:42:09.098210 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:42:09 crc kubenswrapper[4565]: E1125 09:42:09.098762 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:42:09 crc kubenswrapper[4565]: I1125 09:42:09.635341 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" event={"ID":"e1061e52-8553-4932-9689-83016e2b413f","Type":"ContainerStarted","Data":"2b8a742001a8a88e227e92afd5a2522236bc40c43b37e1dca2c96ad55d7ceb45"} Nov 25 09:42:09 crc kubenswrapper[4565]: I1125 09:42:09.660008 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" podStartSLOduration=3.166087052 podStartE2EDuration="3.659763121s" podCreationTimestamp="2025-11-25 09:42:06 +0000 UTC" firstStartedPulling="2025-11-25 09:42:08.461827409 +0000 UTC m=+2261.664322547" lastFinishedPulling="2025-11-25 09:42:08.955503478 +0000 UTC m=+2262.157998616" observedRunningTime="2025-11-25 09:42:09.648135038 +0000 UTC m=+2262.850630176" watchObservedRunningTime="2025-11-25 09:42:09.659763121 +0000 UTC m=+2262.862258259" Nov 25 09:42:10 crc kubenswrapper[4565]: I1125 09:42:10.644006 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jldqn" podUID="8a35f9c7-0f29-4ab3-98ef-463d88be4a17" containerName="registry-server" containerID="cri-o://da0497c788d6fb38aabf7feb9e2d207dcdb5114f98a682301a84f2c59fd99c85" gracePeriod=2 Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 09:42:11.136293 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jldqn" Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 09:42:11.212097 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a35f9c7-0f29-4ab3-98ef-463d88be4a17-catalog-content\") pod \"8a35f9c7-0f29-4ab3-98ef-463d88be4a17\" (UID: \"8a35f9c7-0f29-4ab3-98ef-463d88be4a17\") " Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 09:42:11.212256 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qh7s\" (UniqueName: \"kubernetes.io/projected/8a35f9c7-0f29-4ab3-98ef-463d88be4a17-kube-api-access-5qh7s\") pod \"8a35f9c7-0f29-4ab3-98ef-463d88be4a17\" (UID: \"8a35f9c7-0f29-4ab3-98ef-463d88be4a17\") " Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 09:42:11.212360 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a35f9c7-0f29-4ab3-98ef-463d88be4a17-utilities\") pod \"8a35f9c7-0f29-4ab3-98ef-463d88be4a17\" (UID: \"8a35f9c7-0f29-4ab3-98ef-463d88be4a17\") " Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 09:42:11.213612 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a35f9c7-0f29-4ab3-98ef-463d88be4a17-utilities" (OuterVolumeSpecName: "utilities") pod "8a35f9c7-0f29-4ab3-98ef-463d88be4a17" (UID: "8a35f9c7-0f29-4ab3-98ef-463d88be4a17"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 09:42:11.224425 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a35f9c7-0f29-4ab3-98ef-463d88be4a17-kube-api-access-5qh7s" (OuterVolumeSpecName: "kube-api-access-5qh7s") pod "8a35f9c7-0f29-4ab3-98ef-463d88be4a17" (UID: "8a35f9c7-0f29-4ab3-98ef-463d88be4a17"). InnerVolumeSpecName "kube-api-access-5qh7s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 09:42:11.259402 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a35f9c7-0f29-4ab3-98ef-463d88be4a17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a35f9c7-0f29-4ab3-98ef-463d88be4a17" (UID: "8a35f9c7-0f29-4ab3-98ef-463d88be4a17"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 09:42:11.314669 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qh7s\" (UniqueName: \"kubernetes.io/projected/8a35f9c7-0f29-4ab3-98ef-463d88be4a17-kube-api-access-5qh7s\") on node \"crc\" DevicePath \"\"" Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 09:42:11.314708 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a35f9c7-0f29-4ab3-98ef-463d88be4a17-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 09:42:11.314722 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a35f9c7-0f29-4ab3-98ef-463d88be4a17-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 09:42:11.656469 4565 generic.go:334] "Generic (PLEG): container finished" podID="8a35f9c7-0f29-4ab3-98ef-463d88be4a17" containerID="da0497c788d6fb38aabf7feb9e2d207dcdb5114f98a682301a84f2c59fd99c85" exitCode=0 Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 09:42:11.656525 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jldqn" event={"ID":"8a35f9c7-0f29-4ab3-98ef-463d88be4a17","Type":"ContainerDied","Data":"da0497c788d6fb38aabf7feb9e2d207dcdb5114f98a682301a84f2c59fd99c85"} Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 09:42:11.656572 4565 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-jldqn" event={"ID":"8a35f9c7-0f29-4ab3-98ef-463d88be4a17","Type":"ContainerDied","Data":"86293a2b941ff2a6524c31b5ddb21f215354707922a542e7211a660f72002ce1"} Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 09:42:11.656598 4565 scope.go:117] "RemoveContainer" containerID="da0497c788d6fb38aabf7feb9e2d207dcdb5114f98a682301a84f2c59fd99c85" Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 09:42:11.656631 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jldqn" Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 09:42:11.688727 4565 scope.go:117] "RemoveContainer" containerID="ff5ad8222e531998647b8e603204ccaa22a1fdfd7cd8f858f82e4b9971d5d588" Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 09:42:11.690280 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jldqn"] Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 09:42:11.713626 4565 scope.go:117] "RemoveContainer" containerID="50e56704785ce578620c05cdf92c7d0fcd0432d2ecba43505e74bc12571fee47" Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 09:42:11.733144 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jldqn"] Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 09:42:11.768645 4565 scope.go:117] "RemoveContainer" containerID="da0497c788d6fb38aabf7feb9e2d207dcdb5114f98a682301a84f2c59fd99c85" Nov 25 09:42:11 crc kubenswrapper[4565]: E1125 09:42:11.769419 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da0497c788d6fb38aabf7feb9e2d207dcdb5114f98a682301a84f2c59fd99c85\": container with ID starting with da0497c788d6fb38aabf7feb9e2d207dcdb5114f98a682301a84f2c59fd99c85 not found: ID does not exist" containerID="da0497c788d6fb38aabf7feb9e2d207dcdb5114f98a682301a84f2c59fd99c85" Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 
09:42:11.769463 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da0497c788d6fb38aabf7feb9e2d207dcdb5114f98a682301a84f2c59fd99c85"} err="failed to get container status \"da0497c788d6fb38aabf7feb9e2d207dcdb5114f98a682301a84f2c59fd99c85\": rpc error: code = NotFound desc = could not find container \"da0497c788d6fb38aabf7feb9e2d207dcdb5114f98a682301a84f2c59fd99c85\": container with ID starting with da0497c788d6fb38aabf7feb9e2d207dcdb5114f98a682301a84f2c59fd99c85 not found: ID does not exist" Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 09:42:11.769494 4565 scope.go:117] "RemoveContainer" containerID="ff5ad8222e531998647b8e603204ccaa22a1fdfd7cd8f858f82e4b9971d5d588" Nov 25 09:42:11 crc kubenswrapper[4565]: E1125 09:42:11.769812 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff5ad8222e531998647b8e603204ccaa22a1fdfd7cd8f858f82e4b9971d5d588\": container with ID starting with ff5ad8222e531998647b8e603204ccaa22a1fdfd7cd8f858f82e4b9971d5d588 not found: ID does not exist" containerID="ff5ad8222e531998647b8e603204ccaa22a1fdfd7cd8f858f82e4b9971d5d588" Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 09:42:11.769848 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff5ad8222e531998647b8e603204ccaa22a1fdfd7cd8f858f82e4b9971d5d588"} err="failed to get container status \"ff5ad8222e531998647b8e603204ccaa22a1fdfd7cd8f858f82e4b9971d5d588\": rpc error: code = NotFound desc = could not find container \"ff5ad8222e531998647b8e603204ccaa22a1fdfd7cd8f858f82e4b9971d5d588\": container with ID starting with ff5ad8222e531998647b8e603204ccaa22a1fdfd7cd8f858f82e4b9971d5d588 not found: ID does not exist" Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 09:42:11.769872 4565 scope.go:117] "RemoveContainer" containerID="50e56704785ce578620c05cdf92c7d0fcd0432d2ecba43505e74bc12571fee47" Nov 25 09:42:11 crc 
kubenswrapper[4565]: E1125 09:42:11.770079 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50e56704785ce578620c05cdf92c7d0fcd0432d2ecba43505e74bc12571fee47\": container with ID starting with 50e56704785ce578620c05cdf92c7d0fcd0432d2ecba43505e74bc12571fee47 not found: ID does not exist" containerID="50e56704785ce578620c05cdf92c7d0fcd0432d2ecba43505e74bc12571fee47" Nov 25 09:42:11 crc kubenswrapper[4565]: I1125 09:42:11.770096 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50e56704785ce578620c05cdf92c7d0fcd0432d2ecba43505e74bc12571fee47"} err="failed to get container status \"50e56704785ce578620c05cdf92c7d0fcd0432d2ecba43505e74bc12571fee47\": rpc error: code = NotFound desc = could not find container \"50e56704785ce578620c05cdf92c7d0fcd0432d2ecba43505e74bc12571fee47\": container with ID starting with 50e56704785ce578620c05cdf92c7d0fcd0432d2ecba43505e74bc12571fee47 not found: ID does not exist" Nov 25 09:42:13 crc kubenswrapper[4565]: I1125 09:42:13.107561 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a35f9c7-0f29-4ab3-98ef-463d88be4a17" path="/var/lib/kubelet/pods/8a35f9c7-0f29-4ab3-98ef-463d88be4a17/volumes" Nov 25 09:42:23 crc kubenswrapper[4565]: I1125 09:42:23.097138 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:42:23 crc kubenswrapper[4565]: E1125 09:42:23.098281 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:42:35 crc 
kubenswrapper[4565]: I1125 09:42:35.097661 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:42:35 crc kubenswrapper[4565]: E1125 09:42:35.098374 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:42:46 crc kubenswrapper[4565]: I1125 09:42:46.098332 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:42:46 crc kubenswrapper[4565]: E1125 09:42:46.099353 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:43:00 crc kubenswrapper[4565]: I1125 09:43:00.097687 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:43:00 crc kubenswrapper[4565]: E1125 09:43:00.098729 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 
25 09:43:15 crc kubenswrapper[4565]: I1125 09:43:15.097578 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:43:15 crc kubenswrapper[4565]: E1125 09:43:15.099172 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:43:28 crc kubenswrapper[4565]: I1125 09:43:28.097916 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:43:28 crc kubenswrapper[4565]: E1125 09:43:28.099102 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:43:43 crc kubenswrapper[4565]: I1125 09:43:43.097736 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:43:43 crc kubenswrapper[4565]: E1125 09:43:43.098715 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" 
podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:43:57 crc kubenswrapper[4565]: I1125 09:43:57.102587 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:43:57 crc kubenswrapper[4565]: E1125 09:43:57.103786 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:44:12 crc kubenswrapper[4565]: I1125 09:44:12.097848 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:44:12 crc kubenswrapper[4565]: E1125 09:44:12.098862 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:44:25 crc kubenswrapper[4565]: I1125 09:44:25.097558 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:44:25 crc kubenswrapper[4565]: E1125 09:44:25.098563 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:44:36 crc kubenswrapper[4565]: I1125 09:44:36.097462 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:44:36 crc kubenswrapper[4565]: E1125 09:44:36.098479 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:44:47 crc kubenswrapper[4565]: I1125 09:44:47.101805 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:44:47 crc kubenswrapper[4565]: E1125 09:44:47.102661 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:45:00 crc kubenswrapper[4565]: I1125 09:45:00.179054 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401065-cpr8m"] Nov 25 09:45:00 crc kubenswrapper[4565]: E1125 09:45:00.180226 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a35f9c7-0f29-4ab3-98ef-463d88be4a17" containerName="registry-server" Nov 25 09:45:00 crc kubenswrapper[4565]: I1125 09:45:00.180242 4565 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8a35f9c7-0f29-4ab3-98ef-463d88be4a17" containerName="registry-server" Nov 25 09:45:00 crc kubenswrapper[4565]: E1125 09:45:00.180260 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a35f9c7-0f29-4ab3-98ef-463d88be4a17" containerName="extract-content" Nov 25 09:45:00 crc kubenswrapper[4565]: I1125 09:45:00.180266 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a35f9c7-0f29-4ab3-98ef-463d88be4a17" containerName="extract-content" Nov 25 09:45:00 crc kubenswrapper[4565]: E1125 09:45:00.180282 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a35f9c7-0f29-4ab3-98ef-463d88be4a17" containerName="extract-utilities" Nov 25 09:45:00 crc kubenswrapper[4565]: I1125 09:45:00.180289 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a35f9c7-0f29-4ab3-98ef-463d88be4a17" containerName="extract-utilities" Nov 25 09:45:00 crc kubenswrapper[4565]: I1125 09:45:00.180610 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a35f9c7-0f29-4ab3-98ef-463d88be4a17" containerName="registry-server" Nov 25 09:45:00 crc kubenswrapper[4565]: I1125 09:45:00.181678 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401065-cpr8m" Nov 25 09:45:00 crc kubenswrapper[4565]: I1125 09:45:00.186456 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 25 09:45:00 crc kubenswrapper[4565]: I1125 09:45:00.186755 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 25 09:45:00 crc kubenswrapper[4565]: I1125 09:45:00.203592 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401065-cpr8m"] Nov 25 09:45:00 crc kubenswrapper[4565]: I1125 09:45:00.265505 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrr7w\" (UniqueName: \"kubernetes.io/projected/fb135eff-e058-4340-8c19-2b5170898e24-kube-api-access-hrr7w\") pod \"collect-profiles-29401065-cpr8m\" (UID: \"fb135eff-e058-4340-8c19-2b5170898e24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401065-cpr8m" Nov 25 09:45:00 crc kubenswrapper[4565]: I1125 09:45:00.265589 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb135eff-e058-4340-8c19-2b5170898e24-secret-volume\") pod \"collect-profiles-29401065-cpr8m\" (UID: \"fb135eff-e058-4340-8c19-2b5170898e24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401065-cpr8m" Nov 25 09:45:00 crc kubenswrapper[4565]: I1125 09:45:00.265891 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb135eff-e058-4340-8c19-2b5170898e24-config-volume\") pod \"collect-profiles-29401065-cpr8m\" (UID: \"fb135eff-e058-4340-8c19-2b5170898e24\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29401065-cpr8m" Nov 25 09:45:00 crc kubenswrapper[4565]: I1125 09:45:00.368113 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb135eff-e058-4340-8c19-2b5170898e24-config-volume\") pod \"collect-profiles-29401065-cpr8m\" (UID: \"fb135eff-e058-4340-8c19-2b5170898e24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401065-cpr8m" Nov 25 09:45:00 crc kubenswrapper[4565]: I1125 09:45:00.368350 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrr7w\" (UniqueName: \"kubernetes.io/projected/fb135eff-e058-4340-8c19-2b5170898e24-kube-api-access-hrr7w\") pod \"collect-profiles-29401065-cpr8m\" (UID: \"fb135eff-e058-4340-8c19-2b5170898e24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401065-cpr8m" Nov 25 09:45:00 crc kubenswrapper[4565]: I1125 09:45:00.368463 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb135eff-e058-4340-8c19-2b5170898e24-secret-volume\") pod \"collect-profiles-29401065-cpr8m\" (UID: \"fb135eff-e058-4340-8c19-2b5170898e24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401065-cpr8m" Nov 25 09:45:00 crc kubenswrapper[4565]: I1125 09:45:00.369254 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb135eff-e058-4340-8c19-2b5170898e24-config-volume\") pod \"collect-profiles-29401065-cpr8m\" (UID: \"fb135eff-e058-4340-8c19-2b5170898e24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401065-cpr8m" Nov 25 09:45:00 crc kubenswrapper[4565]: I1125 09:45:00.376498 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/fb135eff-e058-4340-8c19-2b5170898e24-secret-volume\") pod \"collect-profiles-29401065-cpr8m\" (UID: \"fb135eff-e058-4340-8c19-2b5170898e24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401065-cpr8m" Nov 25 09:45:00 crc kubenswrapper[4565]: I1125 09:45:00.386675 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrr7w\" (UniqueName: \"kubernetes.io/projected/fb135eff-e058-4340-8c19-2b5170898e24-kube-api-access-hrr7w\") pod \"collect-profiles-29401065-cpr8m\" (UID: \"fb135eff-e058-4340-8c19-2b5170898e24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401065-cpr8m" Nov 25 09:45:00 crc kubenswrapper[4565]: I1125 09:45:00.507489 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401065-cpr8m" Nov 25 09:45:00 crc kubenswrapper[4565]: I1125 09:45:00.930682 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401065-cpr8m"] Nov 25 09:45:01 crc kubenswrapper[4565]: I1125 09:45:01.101876 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:45:01 crc kubenswrapper[4565]: E1125 09:45:01.102394 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:45:01 crc kubenswrapper[4565]: I1125 09:45:01.254015 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401065-cpr8m" 
event={"ID":"fb135eff-e058-4340-8c19-2b5170898e24","Type":"ContainerStarted","Data":"a3cee72e48a54833e6eac3e1b3b763251c2f98dc0c56f28da878ed957f48315b"} Nov 25 09:45:01 crc kubenswrapper[4565]: I1125 09:45:01.254060 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401065-cpr8m" event={"ID":"fb135eff-e058-4340-8c19-2b5170898e24","Type":"ContainerStarted","Data":"e6160b8e3e1dc079a95ce7c03c176472e319466d6885ac192e9b8f6912969507"} Nov 25 09:45:01 crc kubenswrapper[4565]: I1125 09:45:01.281019 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29401065-cpr8m" podStartSLOduration=1.281006922 podStartE2EDuration="1.281006922s" podCreationTimestamp="2025-11-25 09:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:45:01.276623057 +0000 UTC m=+2434.479118195" watchObservedRunningTime="2025-11-25 09:45:01.281006922 +0000 UTC m=+2434.483502060" Nov 25 09:45:02 crc kubenswrapper[4565]: I1125 09:45:02.266258 4565 generic.go:334] "Generic (PLEG): container finished" podID="fb135eff-e058-4340-8c19-2b5170898e24" containerID="a3cee72e48a54833e6eac3e1b3b763251c2f98dc0c56f28da878ed957f48315b" exitCode=0 Nov 25 09:45:02 crc kubenswrapper[4565]: I1125 09:45:02.266602 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401065-cpr8m" event={"ID":"fb135eff-e058-4340-8c19-2b5170898e24","Type":"ContainerDied","Data":"a3cee72e48a54833e6eac3e1b3b763251c2f98dc0c56f28da878ed957f48315b"} Nov 25 09:45:03 crc kubenswrapper[4565]: I1125 09:45:03.661770 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401065-cpr8m" Nov 25 09:45:03 crc kubenswrapper[4565]: I1125 09:45:03.850974 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrr7w\" (UniqueName: \"kubernetes.io/projected/fb135eff-e058-4340-8c19-2b5170898e24-kube-api-access-hrr7w\") pod \"fb135eff-e058-4340-8c19-2b5170898e24\" (UID: \"fb135eff-e058-4340-8c19-2b5170898e24\") " Nov 25 09:45:03 crc kubenswrapper[4565]: I1125 09:45:03.851677 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb135eff-e058-4340-8c19-2b5170898e24-secret-volume\") pod \"fb135eff-e058-4340-8c19-2b5170898e24\" (UID: \"fb135eff-e058-4340-8c19-2b5170898e24\") " Nov 25 09:45:03 crc kubenswrapper[4565]: I1125 09:45:03.851811 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb135eff-e058-4340-8c19-2b5170898e24-config-volume\") pod \"fb135eff-e058-4340-8c19-2b5170898e24\" (UID: \"fb135eff-e058-4340-8c19-2b5170898e24\") " Nov 25 09:45:03 crc kubenswrapper[4565]: I1125 09:45:03.852755 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb135eff-e058-4340-8c19-2b5170898e24-config-volume" (OuterVolumeSpecName: "config-volume") pod "fb135eff-e058-4340-8c19-2b5170898e24" (UID: "fb135eff-e058-4340-8c19-2b5170898e24"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:45:03 crc kubenswrapper[4565]: I1125 09:45:03.858573 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb135eff-e058-4340-8c19-2b5170898e24-kube-api-access-hrr7w" (OuterVolumeSpecName: "kube-api-access-hrr7w") pod "fb135eff-e058-4340-8c19-2b5170898e24" (UID: "fb135eff-e058-4340-8c19-2b5170898e24"). 
InnerVolumeSpecName "kube-api-access-hrr7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:45:03 crc kubenswrapper[4565]: I1125 09:45:03.859461 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb135eff-e058-4340-8c19-2b5170898e24-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fb135eff-e058-4340-8c19-2b5170898e24" (UID: "fb135eff-e058-4340-8c19-2b5170898e24"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:45:03 crc kubenswrapper[4565]: I1125 09:45:03.954706 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrr7w\" (UniqueName: \"kubernetes.io/projected/fb135eff-e058-4340-8c19-2b5170898e24-kube-api-access-hrr7w\") on node \"crc\" DevicePath \"\"" Nov 25 09:45:03 crc kubenswrapper[4565]: I1125 09:45:03.954755 4565 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb135eff-e058-4340-8c19-2b5170898e24-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 25 09:45:03 crc kubenswrapper[4565]: I1125 09:45:03.954770 4565 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb135eff-e058-4340-8c19-2b5170898e24-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 09:45:04 crc kubenswrapper[4565]: I1125 09:45:04.289511 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401065-cpr8m" event={"ID":"fb135eff-e058-4340-8c19-2b5170898e24","Type":"ContainerDied","Data":"e6160b8e3e1dc079a95ce7c03c176472e319466d6885ac192e9b8f6912969507"} Nov 25 09:45:04 crc kubenswrapper[4565]: I1125 09:45:04.289578 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6160b8e3e1dc079a95ce7c03c176472e319466d6885ac192e9b8f6912969507" Nov 25 09:45:04 crc kubenswrapper[4565]: I1125 09:45:04.289673 4565 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401065-cpr8m" Nov 25 09:45:04 crc kubenswrapper[4565]: I1125 09:45:04.731820 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401020-4vwj8"] Nov 25 09:45:04 crc kubenswrapper[4565]: I1125 09:45:04.746876 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401020-4vwj8"] Nov 25 09:45:05 crc kubenswrapper[4565]: I1125 09:45:05.110552 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a888c987-e97d-4f33-9932-158161870fe6" path="/var/lib/kubelet/pods/a888c987-e97d-4f33-9932-158161870fe6/volumes" Nov 25 09:45:15 crc kubenswrapper[4565]: I1125 09:45:15.097361 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:45:15 crc kubenswrapper[4565]: E1125 09:45:15.098077 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:45:30 crc kubenswrapper[4565]: I1125 09:45:30.097619 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:45:30 crc kubenswrapper[4565]: E1125 09:45:30.098512 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:45:36 crc kubenswrapper[4565]: I1125 09:45:36.824269 4565 scope.go:117] "RemoveContainer" containerID="0cef7b14ef30b0eb4abaff2d20167e00b58f773da50f35150e7ca1b544ff4265" Nov 25 09:45:41 crc kubenswrapper[4565]: I1125 09:45:41.097663 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:45:41 crc kubenswrapper[4565]: E1125 09:45:41.098826 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:45:47 crc kubenswrapper[4565]: I1125 09:45:47.666616 4565 generic.go:334] "Generic (PLEG): container finished" podID="e1061e52-8553-4932-9689-83016e2b413f" containerID="2b8a742001a8a88e227e92afd5a2522236bc40c43b37e1dca2c96ad55d7ceb45" exitCode=0 Nov 25 09:45:47 crc kubenswrapper[4565]: I1125 09:45:47.666705 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" event={"ID":"e1061e52-8553-4932-9689-83016e2b413f","Type":"ContainerDied","Data":"2b8a742001a8a88e227e92afd5a2522236bc40c43b37e1dca2c96ad55d7ceb45"} Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.120276 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.162734 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-inventory\") pod \"e1061e52-8553-4932-9689-83016e2b413f\" (UID: \"e1061e52-8553-4932-9689-83016e2b413f\") " Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.162792 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-ceph\") pod \"e1061e52-8553-4932-9689-83016e2b413f\" (UID: \"e1061e52-8553-4932-9689-83016e2b413f\") " Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.162826 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-libvirt-secret-0\") pod \"e1061e52-8553-4932-9689-83016e2b413f\" (UID: \"e1061e52-8553-4932-9689-83016e2b413f\") " Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.162870 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd5tz\" (UniqueName: \"kubernetes.io/projected/e1061e52-8553-4932-9689-83016e2b413f-kube-api-access-vd5tz\") pod \"e1061e52-8553-4932-9689-83016e2b413f\" (UID: \"e1061e52-8553-4932-9689-83016e2b413f\") " Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.163002 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-libvirt-combined-ca-bundle\") pod \"e1061e52-8553-4932-9689-83016e2b413f\" (UID: \"e1061e52-8553-4932-9689-83016e2b413f\") " Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.164224 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-ssh-key\") pod \"e1061e52-8553-4932-9689-83016e2b413f\" (UID: \"e1061e52-8553-4932-9689-83016e2b413f\") " Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.170123 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e1061e52-8553-4932-9689-83016e2b413f" (UID: "e1061e52-8553-4932-9689-83016e2b413f"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.171231 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-ceph" (OuterVolumeSpecName: "ceph") pod "e1061e52-8553-4932-9689-83016e2b413f" (UID: "e1061e52-8553-4932-9689-83016e2b413f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.171596 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1061e52-8553-4932-9689-83016e2b413f-kube-api-access-vd5tz" (OuterVolumeSpecName: "kube-api-access-vd5tz") pod "e1061e52-8553-4932-9689-83016e2b413f" (UID: "e1061e52-8553-4932-9689-83016e2b413f"). InnerVolumeSpecName "kube-api-access-vd5tz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.191224 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "e1061e52-8553-4932-9689-83016e2b413f" (UID: "e1061e52-8553-4932-9689-83016e2b413f"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.194134 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e1061e52-8553-4932-9689-83016e2b413f" (UID: "e1061e52-8553-4932-9689-83016e2b413f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.198699 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-inventory" (OuterVolumeSpecName: "inventory") pod "e1061e52-8553-4932-9689-83016e2b413f" (UID: "e1061e52-8553-4932-9689-83016e2b413f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.267652 4565 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.268030 4565 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-ceph\") on node \"crc\" DevicePath \"\"" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.268043 4565 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.268064 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd5tz\" (UniqueName: \"kubernetes.io/projected/e1061e52-8553-4932-9689-83016e2b413f-kube-api-access-vd5tz\") on node \"crc\" DevicePath \"\"" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 
09:45:49.268082 4565 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.268097 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e1061e52-8553-4932-9689-83016e2b413f-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.689057 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" event={"ID":"e1061e52-8553-4932-9689-83016e2b413f","Type":"ContainerDied","Data":"52e5a866e74c6a06b962511c4eea56dbab35a8c30063d868b7bafb24278a4c32"} Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.689116 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52e5a866e74c6a06b962511c4eea56dbab35a8c30063d868b7bafb24278a4c32" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.689145 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xqn99" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.856388 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm"] Nov 25 09:45:49 crc kubenswrapper[4565]: E1125 09:45:49.856984 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1061e52-8553-4932-9689-83016e2b413f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.857006 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1061e52-8553-4932-9689-83016e2b413f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 25 09:45:49 crc kubenswrapper[4565]: E1125 09:45:49.857040 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb135eff-e058-4340-8c19-2b5170898e24" containerName="collect-profiles" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.857048 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb135eff-e058-4340-8c19-2b5170898e24" containerName="collect-profiles" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.857249 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb135eff-e058-4340-8c19-2b5170898e24" containerName="collect-profiles" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.857266 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1061e52-8553-4932-9689-83016e2b413f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.858065 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.862796 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47wnc" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.862845 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.862884 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.863017 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.863146 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.863139 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.865100 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.865108 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.865408 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.870660 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm"] Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.887091 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.887140 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.887171 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.887199 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.887219 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkwr9\" (UniqueName: 
\"kubernetes.io/projected/769230ff-fe55-4c62-bc60-73797b5fc1bb-kube-api-access-zkwr9\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.887258 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.887283 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.887328 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.887351 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-migration-ssh-key-0\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.887366 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.887380 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/769230ff-fe55-4c62-bc60-73797b5fc1bb-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.988492 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.988637 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" 
Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.989260 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.989304 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.989334 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/769230ff-fe55-4c62-bc60-73797b5fc1bb-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.989366 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.989402 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.989431 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.989459 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.989493 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkwr9\" (UniqueName: \"kubernetes.io/projected/769230ff-fe55-4c62-bc60-73797b5fc1bb-kube-api-access-zkwr9\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.989527 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-custom-ceph-combined-ca-bundle\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.991103 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.991379 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/769230ff-fe55-4c62-bc60-73797b5fc1bb-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.994633 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.996233 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: 
I1125 09:45:49.997440 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.997454 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:49 crc kubenswrapper[4565]: I1125 09:45:49.998552 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:50 crc kubenswrapper[4565]: I1125 09:45:50.001396 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:50 crc kubenswrapper[4565]: I1125 09:45:50.002684 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:50 crc kubenswrapper[4565]: I1125 09:45:50.013538 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:50 crc kubenswrapper[4565]: I1125 09:45:50.017076 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkwr9\" (UniqueName: \"kubernetes.io/projected/769230ff-fe55-4c62-bc60-73797b5fc1bb-kube-api-access-zkwr9\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:50 crc kubenswrapper[4565]: I1125 09:45:50.186195 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:45:50 crc kubenswrapper[4565]: I1125 09:45:50.828432 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm"] Nov 25 09:45:50 crc kubenswrapper[4565]: I1125 09:45:50.846281 4565 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 09:45:51 crc kubenswrapper[4565]: I1125 09:45:51.704990 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" event={"ID":"769230ff-fe55-4c62-bc60-73797b5fc1bb","Type":"ContainerStarted","Data":"27e0a5be3037c462d19b06bc4383d39a44632cc52a57f870f9c30889adbda1b4"} Nov 25 09:45:51 crc kubenswrapper[4565]: I1125 09:45:51.705268 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" event={"ID":"769230ff-fe55-4c62-bc60-73797b5fc1bb","Type":"ContainerStarted","Data":"f6d73f77697b5f01f6f968338d90ba7bf512d39450c48c185320b4f6c5453c6a"} Nov 25 09:45:51 crc kubenswrapper[4565]: I1125 09:45:51.725175 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" podStartSLOduration=2.136808175 podStartE2EDuration="2.725161723s" podCreationTimestamp="2025-11-25 09:45:49 +0000 UTC" firstStartedPulling="2025-11-25 09:45:50.845959966 +0000 UTC m=+2484.048455104" lastFinishedPulling="2025-11-25 09:45:51.434313514 +0000 UTC m=+2484.636808652" observedRunningTime="2025-11-25 09:45:51.720519341 +0000 UTC m=+2484.923014479" watchObservedRunningTime="2025-11-25 09:45:51.725161723 +0000 UTC m=+2484.927656860" Nov 25 09:45:55 crc kubenswrapper[4565]: I1125 09:45:55.097863 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:45:55 
crc kubenswrapper[4565]: E1125 09:45:55.098979 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:46:08 crc kubenswrapper[4565]: I1125 09:46:08.098092 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:46:08 crc kubenswrapper[4565]: I1125 09:46:08.899853 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerStarted","Data":"dbd04505ddc8880f571911ca07bdffbff9a145427b2f29adc42f041a9dc56899"} Nov 25 09:48:16 crc kubenswrapper[4565]: I1125 09:48:16.107657 4565 generic.go:334] "Generic (PLEG): container finished" podID="769230ff-fe55-4c62-bc60-73797b5fc1bb" containerID="27e0a5be3037c462d19b06bc4383d39a44632cc52a57f870f9c30889adbda1b4" exitCode=0 Nov 25 09:48:16 crc kubenswrapper[4565]: I1125 09:48:16.107771 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" event={"ID":"769230ff-fe55-4c62-bc60-73797b5fc1bb","Type":"ContainerDied","Data":"27e0a5be3037c462d19b06bc4383d39a44632cc52a57f870f9c30889adbda1b4"} Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.482153 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.592618 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-cell1-compute-config-1\") pod \"769230ff-fe55-4c62-bc60-73797b5fc1bb\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.592791 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-custom-ceph-combined-ca-bundle\") pod \"769230ff-fe55-4c62-bc60-73797b5fc1bb\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.592923 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-inventory\") pod \"769230ff-fe55-4c62-bc60-73797b5fc1bb\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.593012 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-extra-config-0\") pod \"769230ff-fe55-4c62-bc60-73797b5fc1bb\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.593064 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-ceph\") pod \"769230ff-fe55-4c62-bc60-73797b5fc1bb\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.593089 4565 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-migration-ssh-key-0\") pod \"769230ff-fe55-4c62-bc60-73797b5fc1bb\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.593113 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-migration-ssh-key-1\") pod \"769230ff-fe55-4c62-bc60-73797b5fc1bb\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.593185 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-ssh-key\") pod \"769230ff-fe55-4c62-bc60-73797b5fc1bb\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.593232 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-cell1-compute-config-0\") pod \"769230ff-fe55-4c62-bc60-73797b5fc1bb\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.593265 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/769230ff-fe55-4c62-bc60-73797b5fc1bb-ceph-nova-0\") pod \"769230ff-fe55-4c62-bc60-73797b5fc1bb\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.593311 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkwr9\" (UniqueName: \"kubernetes.io/projected/769230ff-fe55-4c62-bc60-73797b5fc1bb-kube-api-access-zkwr9\") pod 
\"769230ff-fe55-4c62-bc60-73797b5fc1bb\" (UID: \"769230ff-fe55-4c62-bc60-73797b5fc1bb\") " Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.600123 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/769230ff-fe55-4c62-bc60-73797b5fc1bb-kube-api-access-zkwr9" (OuterVolumeSpecName: "kube-api-access-zkwr9") pod "769230ff-fe55-4c62-bc60-73797b5fc1bb" (UID: "769230ff-fe55-4c62-bc60-73797b5fc1bb"). InnerVolumeSpecName "kube-api-access-zkwr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.603716 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-ceph" (OuterVolumeSpecName: "ceph") pod "769230ff-fe55-4c62-bc60-73797b5fc1bb" (UID: "769230ff-fe55-4c62-bc60-73797b5fc1bb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.618471 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "769230ff-fe55-4c62-bc60-73797b5fc1bb" (UID: "769230ff-fe55-4c62-bc60-73797b5fc1bb"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.630028 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "769230ff-fe55-4c62-bc60-73797b5fc1bb" (UID: "769230ff-fe55-4c62-bc60-73797b5fc1bb"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.630161 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "769230ff-fe55-4c62-bc60-73797b5fc1bb" (UID: "769230ff-fe55-4c62-bc60-73797b5fc1bb"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.630258 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "769230ff-fe55-4c62-bc60-73797b5fc1bb" (UID: "769230ff-fe55-4c62-bc60-73797b5fc1bb"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.631644 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "769230ff-fe55-4c62-bc60-73797b5fc1bb" (UID: "769230ff-fe55-4c62-bc60-73797b5fc1bb"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.633264 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "769230ff-fe55-4c62-bc60-73797b5fc1bb" (UID: "769230ff-fe55-4c62-bc60-73797b5fc1bb"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.639952 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/769230ff-fe55-4c62-bc60-73797b5fc1bb-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "769230ff-fe55-4c62-bc60-73797b5fc1bb" (UID: "769230ff-fe55-4c62-bc60-73797b5fc1bb"). InnerVolumeSpecName "ceph-nova-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.647500 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-inventory" (OuterVolumeSpecName: "inventory") pod "769230ff-fe55-4c62-bc60-73797b5fc1bb" (UID: "769230ff-fe55-4c62-bc60-73797b5fc1bb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.656568 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "769230ff-fe55-4c62-bc60-73797b5fc1bb" (UID: "769230ff-fe55-4c62-bc60-73797b5fc1bb"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.697114 4565 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.697147 4565 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.697160 4565 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.697176 4565 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.697191 4565 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-ceph\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.697200 4565 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.697209 4565 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" 
Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.697220 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.697228 4565 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/769230ff-fe55-4c62-bc60-73797b5fc1bb-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.697238 4565 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/769230ff-fe55-4c62-bc60-73797b5fc1bb-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:17 crc kubenswrapper[4565]: I1125 09:48:17.697247 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkwr9\" (UniqueName: \"kubernetes.io/projected/769230ff-fe55-4c62-bc60-73797b5fc1bb-kube-api-access-zkwr9\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:18 crc kubenswrapper[4565]: I1125 09:48:18.128864 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" event={"ID":"769230ff-fe55-4c62-bc60-73797b5fc1bb","Type":"ContainerDied","Data":"f6d73f77697b5f01f6f968338d90ba7bf512d39450c48c185320b4f6c5453c6a"} Nov 25 09:48:18 crc kubenswrapper[4565]: I1125 09:48:18.128888 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm" Nov 25 09:48:18 crc kubenswrapper[4565]: I1125 09:48:18.128901 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6d73f77697b5f01f6f968338d90ba7bf512d39450c48c185320b4f6c5453c6a" Nov 25 09:48:25 crc kubenswrapper[4565]: I1125 09:48:25.099498 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:48:25 crc kubenswrapper[4565]: I1125 09:48:25.100859 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.263937 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Nov 25 09:48:30 crc kubenswrapper[4565]: E1125 09:48:30.264810 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="769230ff-fe55-4c62-bc60-73797b5fc1bb" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.264833 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="769230ff-fe55-4c62-bc60-73797b5fc1bb" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.268379 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="769230ff-fe55-4c62-bc60-73797b5fc1bb" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.269517 4565 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.273023 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.273622 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.282938 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.286161 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.289368 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.299789 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.320511 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.364804 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-run\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.364863 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d40080b6-5cb6-48e6-9625-9c8b821ed10b-ceph\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc 
kubenswrapper[4565]: I1125 09:48:30.364888 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.364921 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.364966 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-etc-nvme\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.364981 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-sys\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365004 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d3b9c251-0777-4463-916e-b6712e7a69b7-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365025 4565 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d40080b6-5cb6-48e6-9625-9c8b821ed10b-config-data-custom\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365044 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d40080b6-5cb6-48e6-9625-9c8b821ed10b-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365065 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-sys\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365091 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365114 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365137 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365159 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365185 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89dbp\" (UniqueName: \"kubernetes.io/projected/d3b9c251-0777-4463-916e-b6712e7a69b7-kube-api-access-89dbp\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365207 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365227 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pbll\" (UniqueName: \"kubernetes.io/projected/d40080b6-5cb6-48e6-9625-9c8b821ed10b-kube-api-access-6pbll\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365255 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-dev\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365277 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365299 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-run\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365327 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365355 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3b9c251-0777-4463-916e-b6712e7a69b7-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365382 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-dev\") pod 
\"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365421 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365441 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365465 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3b9c251-0777-4463-916e-b6712e7a69b7-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365492 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365512 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3b9c251-0777-4463-916e-b6712e7a69b7-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: 
\"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365530 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d40080b6-5cb6-48e6-9625-9c8b821ed10b-scripts\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365548 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d40080b6-5cb6-48e6-9625-9c8b821ed10b-config-data\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365565 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3b9c251-0777-4463-916e-b6712e7a69b7-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.365601 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-lib-modules\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.467685 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc 
kubenswrapper[4565]: I1125 09:48:30.467746 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-etc-nvme\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.467774 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-sys\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.467798 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d3b9c251-0777-4463-916e-b6712e7a69b7-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.467825 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d40080b6-5cb6-48e6-9625-9c8b821ed10b-config-data-custom\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.467846 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d40080b6-5cb6-48e6-9625-9c8b821ed10b-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.467879 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-sys\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.467916 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.467959 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.467989 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.468015 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-sys\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.468021 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " 
pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.467958 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-etc-nvme\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.468139 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.467990 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-sys\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.468211 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.468613 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.468687 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89dbp\" (UniqueName: 
\"kubernetes.io/projected/d3b9c251-0777-4463-916e-b6712e7a69b7-kube-api-access-89dbp\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.468855 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.469430 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pbll\" (UniqueName: \"kubernetes.io/projected/d40080b6-5cb6-48e6-9625-9c8b821ed10b-kube-api-access-6pbll\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.469906 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-dev\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.470039 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.470127 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-run\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " 
pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.470260 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.470371 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3b9c251-0777-4463-916e-b6712e7a69b7-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.470495 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-dev\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.470716 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.470807 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.470893 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/d3b9c251-0777-4463-916e-b6712e7a69b7-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.470997 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.471073 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3b9c251-0777-4463-916e-b6712e7a69b7-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.471157 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d40080b6-5cb6-48e6-9625-9c8b821ed10b-scripts\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.471237 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d40080b6-5cb6-48e6-9625-9c8b821ed10b-config-data\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.471308 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3b9c251-0777-4463-916e-b6712e7a69b7-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: 
\"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.471487 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-lib-modules\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.471611 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-run\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.471701 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d40080b6-5cb6-48e6-9625-9c8b821ed10b-ceph\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.471769 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.472183 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.469012 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.473318 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-run\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.473376 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-lib-modules\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.473432 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.473509 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.469109 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" 
Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.473588 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-dev\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.469123 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.473645 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.473686 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-run\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.473712 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.474844 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/d40080b6-5cb6-48e6-9625-9c8b821ed10b-dev\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.476452 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d3b9c251-0777-4463-916e-b6712e7a69b7-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.477523 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d40080b6-5cb6-48e6-9625-9c8b821ed10b-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.479550 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3b9c251-0777-4463-916e-b6712e7a69b7-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.479616 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3b9c251-0777-4463-916e-b6712e7a69b7-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.482315 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d40080b6-5cb6-48e6-9625-9c8b821ed10b-ceph\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 
09:48:30.482677 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d40080b6-5cb6-48e6-9625-9c8b821ed10b-scripts\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.483729 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3b9c251-0777-4463-916e-b6712e7a69b7-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.485353 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d40080b6-5cb6-48e6-9625-9c8b821ed10b-config-data-custom\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.485888 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3b9c251-0777-4463-916e-b6712e7a69b7-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.490665 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89dbp\" (UniqueName: \"kubernetes.io/projected/d3b9c251-0777-4463-916e-b6712e7a69b7-kube-api-access-89dbp\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.491511 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d3b9c251-0777-4463-916e-b6712e7a69b7-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"d3b9c251-0777-4463-916e-b6712e7a69b7\") " pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.492419 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d40080b6-5cb6-48e6-9625-9c8b821ed10b-config-data\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.504409 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pbll\" (UniqueName: \"kubernetes.io/projected/d40080b6-5cb6-48e6-9625-9c8b821ed10b-kube-api-access-6pbll\") pod \"cinder-backup-0\" (UID: \"d40080b6-5cb6-48e6-9625-9c8b821ed10b\") " pod="openstack/cinder-backup-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.589568 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:30 crc kubenswrapper[4565]: I1125 09:48:30.600131 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.019259 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-lcpln"] Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.020734 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-lcpln" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.037060 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-lcpln"] Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.075759 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-4214-account-create-ftd48"] Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.081433 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-4214-account-create-ftd48" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.082999 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.087580 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bc64fad-0ded-4d44-a53b-cdd2f5cebc45-operator-scripts\") pod \"manila-db-create-lcpln\" (UID: \"6bc64fad-0ded-4d44-a53b-cdd2f5cebc45\") " pod="openstack/manila-db-create-lcpln" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.092447 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b8rn\" (UniqueName: \"kubernetes.io/projected/6bc64fad-0ded-4d44-a53b-cdd2f5cebc45-kube-api-access-6b8rn\") pod \"manila-db-create-lcpln\" (UID: \"6bc64fad-0ded-4d44-a53b-cdd2f5cebc45\") " pod="openstack/manila-db-create-lcpln" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.125987 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-4214-account-create-ftd48"] Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.138048 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5bbcb89d85-28rqw"] Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.139276 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5bbcb89d85-28rqw" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.142515 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.142877 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-zgvl6" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.146415 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.146939 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.195487 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.196908 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.197825 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bd8b0555-5ea6-4fae-b234-7f7661406dd1-horizon-secret-key\") pod \"horizon-5bbcb89d85-28rqw\" (UID: \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\") " pod="openstack/horizon-5bbcb89d85-28rqw" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.200354 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bc64fad-0ded-4d44-a53b-cdd2f5cebc45-operator-scripts\") pod \"manila-db-create-lcpln\" (UID: \"6bc64fad-0ded-4d44-a53b-cdd2f5cebc45\") " pod="openstack/manila-db-create-lcpln" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.200481 4565 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmdwb\" (UniqueName: \"kubernetes.io/projected/427c19d8-2ea1-4831-8591-8b8d52eb83dd-kube-api-access-cmdwb\") pod \"manila-4214-account-create-ftd48\" (UID: \"427c19d8-2ea1-4831-8591-8b8d52eb83dd\") " pod="openstack/manila-4214-account-create-ftd48" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.200593 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b8rn\" (UniqueName: \"kubernetes.io/projected/6bc64fad-0ded-4d44-a53b-cdd2f5cebc45-kube-api-access-6b8rn\") pod \"manila-db-create-lcpln\" (UID: \"6bc64fad-0ded-4d44-a53b-cdd2f5cebc45\") " pod="openstack/manila-db-create-lcpln" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.200695 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/427c19d8-2ea1-4831-8591-8b8d52eb83dd-operator-scripts\") pod \"manila-4214-account-create-ftd48\" (UID: \"427c19d8-2ea1-4831-8591-8b8d52eb83dd\") " pod="openstack/manila-4214-account-create-ftd48" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.200769 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd8b0555-5ea6-4fae-b234-7f7661406dd1-logs\") pod \"horizon-5bbcb89d85-28rqw\" (UID: \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\") " pod="openstack/horizon-5bbcb89d85-28rqw" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.200843 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd8b0555-5ea6-4fae-b234-7f7661406dd1-scripts\") pod \"horizon-5bbcb89d85-28rqw\" (UID: \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\") " pod="openstack/horizon-5bbcb89d85-28rqw" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.201024 4565 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd8b0555-5ea6-4fae-b234-7f7661406dd1-config-data\") pod \"horizon-5bbcb89d85-28rqw\" (UID: \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\") " pod="openstack/horizon-5bbcb89d85-28rqw" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.201138 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdhv8\" (UniqueName: \"kubernetes.io/projected/bd8b0555-5ea6-4fae-b234-7f7661406dd1-kube-api-access-sdhv8\") pod \"horizon-5bbcb89d85-28rqw\" (UID: \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\") " pod="openstack/horizon-5bbcb89d85-28rqw" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.201227 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bc64fad-0ded-4d44-a53b-cdd2f5cebc45-operator-scripts\") pod \"manila-db-create-lcpln\" (UID: \"6bc64fad-0ded-4d44-a53b-cdd2f5cebc45\") " pod="openstack/manila-db-create-lcpln" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.202744 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-j54bw" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.204371 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.211150 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.211402 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.217041 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5bbcb89d85-28rqw"] Nov 25 09:48:31 crc 
kubenswrapper[4565]: I1125 09:48:31.225588 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b8rn\" (UniqueName: \"kubernetes.io/projected/6bc64fad-0ded-4d44-a53b-cdd2f5cebc45-kube-api-access-6b8rn\") pod \"manila-db-create-lcpln\" (UID: \"6bc64fad-0ded-4d44-a53b-cdd2f5cebc45\") " pod="openstack/manila-db-create-lcpln" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.226464 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.285949 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.303691 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bd8b0555-5ea6-4fae-b234-7f7661406dd1-horizon-secret-key\") pod \"horizon-5bbcb89d85-28rqw\" (UID: \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\") " pod="openstack/horizon-5bbcb89d85-28rqw" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.305140 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmdwb\" (UniqueName: \"kubernetes.io/projected/427c19d8-2ea1-4831-8591-8b8d52eb83dd-kube-api-access-cmdwb\") pod \"manila-4214-account-create-ftd48\" (UID: \"427c19d8-2ea1-4831-8591-8b8d52eb83dd\") " pod="openstack/manila-4214-account-create-ftd48" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.305308 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/427c19d8-2ea1-4831-8591-8b8d52eb83dd-operator-scripts\") pod \"manila-4214-account-create-ftd48\" (UID: \"427c19d8-2ea1-4831-8591-8b8d52eb83dd\") " pod="openstack/manila-4214-account-create-ftd48" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.305414 4565 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd8b0555-5ea6-4fae-b234-7f7661406dd1-logs\") pod \"horizon-5bbcb89d85-28rqw\" (UID: \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\") " pod="openstack/horizon-5bbcb89d85-28rqw" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.305500 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd8b0555-5ea6-4fae-b234-7f7661406dd1-scripts\") pod \"horizon-5bbcb89d85-28rqw\" (UID: \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\") " pod="openstack/horizon-5bbcb89d85-28rqw" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.305698 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd8b0555-5ea6-4fae-b234-7f7661406dd1-config-data\") pod \"horizon-5bbcb89d85-28rqw\" (UID: \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\") " pod="openstack/horizon-5bbcb89d85-28rqw" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.305807 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdhv8\" (UniqueName: \"kubernetes.io/projected/bd8b0555-5ea6-4fae-b234-7f7661406dd1-kube-api-access-sdhv8\") pod \"horizon-5bbcb89d85-28rqw\" (UID: \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\") " pod="openstack/horizon-5bbcb89d85-28rqw" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.319184 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd8b0555-5ea6-4fae-b234-7f7661406dd1-logs\") pod \"horizon-5bbcb89d85-28rqw\" (UID: \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\") " pod="openstack/horizon-5bbcb89d85-28rqw" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.320075 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/427c19d8-2ea1-4831-8591-8b8d52eb83dd-operator-scripts\") pod \"manila-4214-account-create-ftd48\" (UID: \"427c19d8-2ea1-4831-8591-8b8d52eb83dd\") " pod="openstack/manila-4214-account-create-ftd48" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.320722 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd8b0555-5ea6-4fae-b234-7f7661406dd1-scripts\") pod \"horizon-5bbcb89d85-28rqw\" (UID: \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\") " pod="openstack/horizon-5bbcb89d85-28rqw" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.335196 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.338535 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd8b0555-5ea6-4fae-b234-7f7661406dd1-config-data\") pod \"horizon-5bbcb89d85-28rqw\" (UID: \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\") " pod="openstack/horizon-5bbcb89d85-28rqw" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.343540 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmdwb\" (UniqueName: \"kubernetes.io/projected/427c19d8-2ea1-4831-8591-8b8d52eb83dd-kube-api-access-cmdwb\") pod \"manila-4214-account-create-ftd48\" (UID: \"427c19d8-2ea1-4831-8591-8b8d52eb83dd\") " pod="openstack/manila-4214-account-create-ftd48" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.350483 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.352603 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdhv8\" (UniqueName: \"kubernetes.io/projected/bd8b0555-5ea6-4fae-b234-7f7661406dd1-kube-api-access-sdhv8\") pod 
\"horizon-5bbcb89d85-28rqw\" (UID: \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\") " pod="openstack/horizon-5bbcb89d85-28rqw" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.352773 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.380444 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bd8b0555-5ea6-4fae-b234-7f7661406dd1-horizon-secret-key\") pod \"horizon-5bbcb89d85-28rqw\" (UID: \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\") " pod="openstack/horizon-5bbcb89d85-28rqw" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.384996 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.390810 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-lcpln" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.407965 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 09:48:31 crc kubenswrapper[4565]: E1125 09:48:31.412619 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceph combined-ca-bundle config-data glance httpd-run internal-tls-certs kube-api-access-8k7wm logs scripts], unattached volumes=[], failed to process volumes=[ceph combined-ca-bundle config-data glance httpd-run internal-tls-certs kube-api-access-8k7wm logs scripts]: context canceled" pod="openstack/glance-default-internal-api-0" podUID="8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.414046 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzglt\" (UniqueName: \"kubernetes.io/projected/2237351d-d594-4ff6-beb7-dcb02aae0366-kube-api-access-jzglt\") pod 
\"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.414096 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2237351d-d594-4ff6-beb7-dcb02aae0366-logs\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.414142 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.414562 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2237351d-d594-4ff6-beb7-dcb02aae0366-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.414708 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2237351d-d594-4ff6-beb7-dcb02aae0366-scripts\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.414813 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2237351d-d594-4ff6-beb7-dcb02aae0366-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.414837 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2237351d-d594-4ff6-beb7-dcb02aae0366-config-data\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.414853 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2237351d-d594-4ff6-beb7-dcb02aae0366-ceph\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.414870 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2237351d-d594-4ff6-beb7-dcb02aae0366-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.416882 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-4214-account-create-ftd48" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.445001 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 09:48:31 crc kubenswrapper[4565]: E1125 09:48:31.445842 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceph combined-ca-bundle config-data glance httpd-run kube-api-access-jzglt logs public-tls-certs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-external-api-0" podUID="2237351d-d594-4ff6-beb7-dcb02aae0366" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.450387 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-67bbcd57bf-nszbm"] Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.452074 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67bbcd57bf-nszbm" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.464324 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67bbcd57bf-nszbm"] Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.465903 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5bbcb89d85-28rqw" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.518357 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-logs\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.518413 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k7wm\" (UniqueName: \"kubernetes.io/projected/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-kube-api-access-8k7wm\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.518455 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzglt\" (UniqueName: \"kubernetes.io/projected/2237351d-d594-4ff6-beb7-dcb02aae0366-kube-api-access-jzglt\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.518482 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2237351d-d594-4ff6-beb7-dcb02aae0366-logs\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.518512 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f6405915-6a04-4cdb-b837-6f12e31bb7bc-horizon-secret-key\") pod \"horizon-67bbcd57bf-nszbm\" (UID: 
\"f6405915-6a04-4cdb-b837-6f12e31bb7bc\") " pod="openstack/horizon-67bbcd57bf-nszbm" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.518533 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.518575 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2237351d-d594-4ff6-beb7-dcb02aae0366-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.518595 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.518617 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6405915-6a04-4cdb-b837-6f12e31bb7bc-config-data\") pod \"horizon-67bbcd57bf-nszbm\" (UID: \"f6405915-6a04-4cdb-b837-6f12e31bb7bc\") " pod="openstack/horizon-67bbcd57bf-nszbm" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.518636 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.518733 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2237351d-d594-4ff6-beb7-dcb02aae0366-scripts\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.518754 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6405915-6a04-4cdb-b837-6f12e31bb7bc-logs\") pod \"horizon-67bbcd57bf-nszbm\" (UID: \"f6405915-6a04-4cdb-b837-6f12e31bb7bc\") " pod="openstack/horizon-67bbcd57bf-nszbm" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.518772 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.518798 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6405915-6a04-4cdb-b837-6f12e31bb7bc-scripts\") pod \"horizon-67bbcd57bf-nszbm\" (UID: \"f6405915-6a04-4cdb-b837-6f12e31bb7bc\") " pod="openstack/horizon-67bbcd57bf-nszbm" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.518814 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4hds\" (UniqueName: \"kubernetes.io/projected/f6405915-6a04-4cdb-b837-6f12e31bb7bc-kube-api-access-n4hds\") pod \"horizon-67bbcd57bf-nszbm\" (UID: 
\"f6405915-6a04-4cdb-b837-6f12e31bb7bc\") " pod="openstack/horizon-67bbcd57bf-nszbm" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.518853 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2237351d-d594-4ff6-beb7-dcb02aae0366-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.518868 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2237351d-d594-4ff6-beb7-dcb02aae0366-config-data\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.518883 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2237351d-d594-4ff6-beb7-dcb02aae0366-ceph\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.518898 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2237351d-d594-4ff6-beb7-dcb02aae0366-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.518922 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " 
pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.518953 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.518978 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-ceph\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.518995 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.519735 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2237351d-d594-4ff6-beb7-dcb02aae0366-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.519732 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2237351d-d594-4ff6-beb7-dcb02aae0366-logs\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: 
I1125 09:48:31.520274 4565 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.529324 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2237351d-d594-4ff6-beb7-dcb02aae0366-ceph\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.532378 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2237351d-d594-4ff6-beb7-dcb02aae0366-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.535963 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2237351d-d594-4ff6-beb7-dcb02aae0366-config-data\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.553683 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2237351d-d594-4ff6-beb7-dcb02aae0366-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.558055 4565 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-jzglt\" (UniqueName: \"kubernetes.io/projected/2237351d-d594-4ff6-beb7-dcb02aae0366-kube-api-access-jzglt\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.572840 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.575610 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2237351d-d594-4ff6-beb7-dcb02aae0366-scripts\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.595783 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.626039 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.626073 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.626109 4565 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-ceph\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.626130 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.626157 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-logs\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.626179 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k7wm\" (UniqueName: \"kubernetes.io/projected/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-kube-api-access-8k7wm\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.626248 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f6405915-6a04-4cdb-b837-6f12e31bb7bc-horizon-secret-key\") pod \"horizon-67bbcd57bf-nszbm\" (UID: \"f6405915-6a04-4cdb-b837-6f12e31bb7bc\") " pod="openstack/horizon-67bbcd57bf-nszbm" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.626315 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.626338 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6405915-6a04-4cdb-b837-6f12e31bb7bc-config-data\") pod \"horizon-67bbcd57bf-nszbm\" (UID: \"f6405915-6a04-4cdb-b837-6f12e31bb7bc\") " pod="openstack/horizon-67bbcd57bf-nszbm" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.626358 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.626385 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6405915-6a04-4cdb-b837-6f12e31bb7bc-logs\") pod \"horizon-67bbcd57bf-nszbm\" (UID: \"f6405915-6a04-4cdb-b837-6f12e31bb7bc\") " pod="openstack/horizon-67bbcd57bf-nszbm" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.626403 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.626436 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6405915-6a04-4cdb-b837-6f12e31bb7bc-scripts\") pod 
\"horizon-67bbcd57bf-nszbm\" (UID: \"f6405915-6a04-4cdb-b837-6f12e31bb7bc\") " pod="openstack/horizon-67bbcd57bf-nszbm" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.626452 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4hds\" (UniqueName: \"kubernetes.io/projected/f6405915-6a04-4cdb-b837-6f12e31bb7bc-kube-api-access-n4hds\") pod \"horizon-67bbcd57bf-nszbm\" (UID: \"f6405915-6a04-4cdb-b837-6f12e31bb7bc\") " pod="openstack/horizon-67bbcd57bf-nszbm" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.627064 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.633374 4565 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.635475 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-logs\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.638857 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6405915-6a04-4cdb-b837-6f12e31bb7bc-logs\") pod \"horizon-67bbcd57bf-nszbm\" (UID: \"f6405915-6a04-4cdb-b837-6f12e31bb7bc\") " 
pod="openstack/horizon-67bbcd57bf-nszbm" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.640282 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6405915-6a04-4cdb-b837-6f12e31bb7bc-config-data\") pod \"horizon-67bbcd57bf-nszbm\" (UID: \"f6405915-6a04-4cdb-b837-6f12e31bb7bc\") " pod="openstack/horizon-67bbcd57bf-nszbm" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.643409 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6405915-6a04-4cdb-b837-6f12e31bb7bc-scripts\") pod \"horizon-67bbcd57bf-nszbm\" (UID: \"f6405915-6a04-4cdb-b837-6f12e31bb7bc\") " pod="openstack/horizon-67bbcd57bf-nszbm" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.655523 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k7wm\" (UniqueName: \"kubernetes.io/projected/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-kube-api-access-8k7wm\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.665029 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.671489 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4hds\" (UniqueName: \"kubernetes.io/projected/f6405915-6a04-4cdb-b837-6f12e31bb7bc-kube-api-access-n4hds\") pod \"horizon-67bbcd57bf-nszbm\" (UID: \"f6405915-6a04-4cdb-b837-6f12e31bb7bc\") " pod="openstack/horizon-67bbcd57bf-nszbm" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 
09:48:31.674496 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.693906 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.711187 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.711485 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-ceph\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.711750 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f6405915-6a04-4cdb-b837-6f12e31bb7bc-horizon-secret-key\") pod \"horizon-67bbcd57bf-nszbm\" (UID: \"f6405915-6a04-4cdb-b837-6f12e31bb7bc\") " pod="openstack/horizon-67bbcd57bf-nszbm" Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.715203 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:31 crc 
kubenswrapper[4565]: I1125 09:48:31.733019 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") " pod="openstack/glance-default-internal-api-0"
Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.770768 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67bbcd57bf-nszbm"
Nov 25 09:48:31 crc kubenswrapper[4565]: W1125 09:48:31.996428 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bc64fad_0ded_4d44_a53b_cdd2f5cebc45.slice/crio-7f53899380e904b7e81443e5a5093307c6e4b140af8c3ca900dad756ea94d700 WatchSource:0}: Error finding container 7f53899380e904b7e81443e5a5093307c6e4b140af8c3ca900dad756ea94d700: Status 404 returned error can't find the container with id 7f53899380e904b7e81443e5a5093307c6e4b140af8c3ca900dad756ea94d700
Nov 25 09:48:31 crc kubenswrapper[4565]: I1125 09:48:31.997577 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-lcpln"]
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.100140 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-4214-account-create-ftd48"]
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.220777 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5bbcb89d85-28rqw"]
Nov 25 09:48:32 crc kubenswrapper[4565]: W1125 09:48:32.230706 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd8b0555_5ea6_4fae_b234_7f7661406dd1.slice/crio-5d0224d113bfffd0e85692de07c0f00e962764f95332a92bd03c4239ce0d852b WatchSource:0}: Error finding container 5d0224d113bfffd0e85692de07c0f00e962764f95332a92bd03c4239ce0d852b: Status 404 returned error can't find the container with id 5d0224d113bfffd0e85692de07c0f00e962764f95332a92bd03c4239ce0d852b
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.299844 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-lcpln" event={"ID":"6bc64fad-0ded-4d44-a53b-cdd2f5cebc45","Type":"ContainerStarted","Data":"b581befa176a1d323f49a2b4397a4188801e223001a1211519342157891532f1"}
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.299890 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-lcpln" event={"ID":"6bc64fad-0ded-4d44-a53b-cdd2f5cebc45","Type":"ContainerStarted","Data":"7f53899380e904b7e81443e5a5093307c6e4b140af8c3ca900dad756ea94d700"}
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.311845 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"d40080b6-5cb6-48e6-9625-9c8b821ed10b","Type":"ContainerStarted","Data":"9e3084e6b86301c5706086b1244e4829d9362316f20bf6400b3af7bea174a24c"}
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.319178 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67bbcd57bf-nszbm"]
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.327042 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bbcb89d85-28rqw" event={"ID":"bd8b0555-5ea6-4fae-b234-7f7661406dd1","Type":"ContainerStarted","Data":"5d0224d113bfffd0e85692de07c0f00e962764f95332a92bd03c4239ce0d852b"}
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.329115 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"d3b9c251-0777-4463-916e-b6712e7a69b7","Type":"ContainerStarted","Data":"07e50e0a9c68849b8e2dedb91c9a7e2059b81e6e58cb84dbcf61466a171d0156"}
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.336356 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-4214-account-create-ftd48" event={"ID":"427c19d8-2ea1-4831-8591-8b8d52eb83dd","Type":"ContainerStarted","Data":"2edb58051f1aca678a1c555d7306208b575dd8c28e498fccc5dff2851d2e5c93"}
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.336375 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.336422 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.344841 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-create-lcpln" podStartSLOduration=2.344829106 podStartE2EDuration="2.344829106s" podCreationTimestamp="2025-11-25 09:48:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:48:32.321706846 +0000 UTC m=+2645.524201984" watchObservedRunningTime="2025-11-25 09:48:32.344829106 +0000 UTC m=+2645.547324244"
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.361382 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.373912 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.382227 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-4214-account-create-ftd48" podStartSLOduration=1.38221104 podStartE2EDuration="1.38221104s" podCreationTimestamp="2025-11-25 09:48:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:48:32.353749964 +0000 UTC m=+2645.556245092" watchObservedRunningTime="2025-11-25 09:48:32.38221104 +0000 UTC m=+2645.584706179"
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.448232 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-config-data\") pod \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") "
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.448277 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k7wm\" (UniqueName: \"kubernetes.io/projected/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-kube-api-access-8k7wm\") pod \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") "
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.448335 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-logs\") pod \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") "
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.448375 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-scripts\") pod \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") "
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.448544 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-internal-tls-certs\") pod \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") "
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.448678 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-combined-ca-bundle\") pod \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") "
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.448713 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-httpd-run\") pod \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") "
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.449120 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-logs" (OuterVolumeSpecName: "logs") pod "8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde" (UID: "8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.449992 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") "
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.450073 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-ceph\") pod \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\" (UID: \"8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde\") "
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.450781 4565 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-logs\") on node \"crc\" DevicePath \"\""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.451272 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde" (UID: "8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.461154 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde" (UID: "8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.461232 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde" (UID: "8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.463100 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-config-data" (OuterVolumeSpecName: "config-data") pod "8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde" (UID: "8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.463149 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde" (UID: "8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.464506 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-scripts" (OuterVolumeSpecName: "scripts") pod "8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde" (UID: "8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.466253 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-kube-api-access-8k7wm" (OuterVolumeSpecName: "kube-api-access-8k7wm") pod "8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde" (UID: "8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde"). InnerVolumeSpecName "kube-api-access-8k7wm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.470142 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-ceph" (OuterVolumeSpecName: "ceph") pod "8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde" (UID: "8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.552244 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2237351d-d594-4ff6-beb7-dcb02aae0366-logs\") pod \"2237351d-d594-4ff6-beb7-dcb02aae0366\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") "
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.552506 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2237351d-d594-4ff6-beb7-dcb02aae0366-logs" (OuterVolumeSpecName: "logs") pod "2237351d-d594-4ff6-beb7-dcb02aae0366" (UID: "2237351d-d594-4ff6-beb7-dcb02aae0366"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.552530 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzglt\" (UniqueName: \"kubernetes.io/projected/2237351d-d594-4ff6-beb7-dcb02aae0366-kube-api-access-jzglt\") pod \"2237351d-d594-4ff6-beb7-dcb02aae0366\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") "
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.552608 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"2237351d-d594-4ff6-beb7-dcb02aae0366\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") "
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.552685 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2237351d-d594-4ff6-beb7-dcb02aae0366-scripts\") pod \"2237351d-d594-4ff6-beb7-dcb02aae0366\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") "
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.552720 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2237351d-d594-4ff6-beb7-dcb02aae0366-combined-ca-bundle\") pod \"2237351d-d594-4ff6-beb7-dcb02aae0366\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") "
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.552754 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2237351d-d594-4ff6-beb7-dcb02aae0366-ceph\") pod \"2237351d-d594-4ff6-beb7-dcb02aae0366\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") "
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.552793 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2237351d-d594-4ff6-beb7-dcb02aae0366-config-data\") pod \"2237351d-d594-4ff6-beb7-dcb02aae0366\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") "
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.552827 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2237351d-d594-4ff6-beb7-dcb02aae0366-httpd-run\") pod \"2237351d-d594-4ff6-beb7-dcb02aae0366\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") "
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.552849 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2237351d-d594-4ff6-beb7-dcb02aae0366-public-tls-certs\") pod \"2237351d-d594-4ff6-beb7-dcb02aae0366\" (UID: \"2237351d-d594-4ff6-beb7-dcb02aae0366\") "
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.553637 4565 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.553659 4565 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2237351d-d594-4ff6-beb7-dcb02aae0366-logs\") on node \"crc\" DevicePath \"\""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.553670 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.553679 4565 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-httpd-run\") on node \"crc\" DevicePath \"\""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.553699 4565 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.553708 4565 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-ceph\") on node \"crc\" DevicePath \"\""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.553716 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-config-data\") on node \"crc\" DevicePath \"\""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.553726 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k7wm\" (UniqueName: \"kubernetes.io/projected/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-kube-api-access-8k7wm\") on node \"crc\" DevicePath \"\""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.553736 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde-scripts\") on node \"crc\" DevicePath \"\""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.556360 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2237351d-d594-4ff6-beb7-dcb02aae0366-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2237351d-d594-4ff6-beb7-dcb02aae0366" (UID: "2237351d-d594-4ff6-beb7-dcb02aae0366"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.556841 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2237351d-d594-4ff6-beb7-dcb02aae0366-kube-api-access-jzglt" (OuterVolumeSpecName: "kube-api-access-jzglt") pod "2237351d-d594-4ff6-beb7-dcb02aae0366" (UID: "2237351d-d594-4ff6-beb7-dcb02aae0366"). InnerVolumeSpecName "kube-api-access-jzglt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.560604 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "2237351d-d594-4ff6-beb7-dcb02aae0366" (UID: "2237351d-d594-4ff6-beb7-dcb02aae0366"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.560704 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2237351d-d594-4ff6-beb7-dcb02aae0366-scripts" (OuterVolumeSpecName: "scripts") pod "2237351d-d594-4ff6-beb7-dcb02aae0366" (UID: "2237351d-d594-4ff6-beb7-dcb02aae0366"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.561351 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2237351d-d594-4ff6-beb7-dcb02aae0366-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2237351d-d594-4ff6-beb7-dcb02aae0366" (UID: "2237351d-d594-4ff6-beb7-dcb02aae0366"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.561628 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2237351d-d594-4ff6-beb7-dcb02aae0366-config-data" (OuterVolumeSpecName: "config-data") pod "2237351d-d594-4ff6-beb7-dcb02aae0366" (UID: "2237351d-d594-4ff6-beb7-dcb02aae0366"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.565982 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2237351d-d594-4ff6-beb7-dcb02aae0366-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2237351d-d594-4ff6-beb7-dcb02aae0366" (UID: "2237351d-d594-4ff6-beb7-dcb02aae0366"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.566119 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2237351d-d594-4ff6-beb7-dcb02aae0366-ceph" (OuterVolumeSpecName: "ceph") pod "2237351d-d594-4ff6-beb7-dcb02aae0366" (UID: "2237351d-d594-4ff6-beb7-dcb02aae0366"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.586805 4565 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.656169 4565 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.656203 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2237351d-d594-4ff6-beb7-dcb02aae0366-scripts\") on node \"crc\" DevicePath \"\""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.656215 4565 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.656231 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2237351d-d594-4ff6-beb7-dcb02aae0366-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.656242 4565 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2237351d-d594-4ff6-beb7-dcb02aae0366-ceph\") on node \"crc\" DevicePath \"\""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.656251 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2237351d-d594-4ff6-beb7-dcb02aae0366-config-data\") on node \"crc\" DevicePath \"\""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.656258 4565 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2237351d-d594-4ff6-beb7-dcb02aae0366-httpd-run\") on node \"crc\" DevicePath \"\""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.656267 4565 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2237351d-d594-4ff6-beb7-dcb02aae0366-public-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.656275 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzglt\" (UniqueName: \"kubernetes.io/projected/2237351d-d594-4ff6-beb7-dcb02aae0366-kube-api-access-jzglt\") on node \"crc\" DevicePath \"\""
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.677134 4565 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Nov 25 09:48:32 crc kubenswrapper[4565]: I1125 09:48:32.767690 4565 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.350305 4565 generic.go:334] "Generic (PLEG): container finished" podID="427c19d8-2ea1-4831-8591-8b8d52eb83dd" containerID="781d16635c862d212f8d2bf238daeac708acc52a210abfc018f30477cee35469" exitCode=0
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.350611 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-4214-account-create-ftd48" event={"ID":"427c19d8-2ea1-4831-8591-8b8d52eb83dd","Type":"ContainerDied","Data":"781d16635c862d212f8d2bf238daeac708acc52a210abfc018f30477cee35469"}
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.352267 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67bbcd57bf-nszbm" event={"ID":"f6405915-6a04-4cdb-b837-6f12e31bb7bc","Type":"ContainerStarted","Data":"c04d2084cd8ae19fa33f7a3e724cfd9a7dbfe411f05e5a1067da35fe4515fe2b"}
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.354250 4565 generic.go:334] "Generic (PLEG): container finished" podID="6bc64fad-0ded-4d44-a53b-cdd2f5cebc45" containerID="b581befa176a1d323f49a2b4397a4188801e223001a1211519342157891532f1" exitCode=0
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.354377 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-lcpln" event={"ID":"6bc64fad-0ded-4d44-a53b-cdd2f5cebc45","Type":"ContainerDied","Data":"b581befa176a1d323f49a2b4397a4188801e223001a1211519342157891532f1"}
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.355583 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"d40080b6-5cb6-48e6-9625-9c8b821ed10b","Type":"ContainerStarted","Data":"afcc9c3f363b2e395a5662f439623648fbed3721b3ae9451e9d8687444ad38d4"}
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.357598 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"d3b9c251-0777-4463-916e-b6712e7a69b7","Type":"ContainerStarted","Data":"dc3cb81c9b0167284846e7ce3bbe364634018e55192a91832900803697e4297a"}
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.357662 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.357765 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.455691 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.487815 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.500972 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.502638 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.508059 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-j54bw"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.508251 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.508504 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.509795 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.524129 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.547059 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.587845 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.606527 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.609718 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.612356 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.614561 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.674251 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.698593 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.698694 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.698766 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d44kx\" (UniqueName: \"kubernetes.io/projected/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-kube-api-access-d44kx\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.698827 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.698902 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-logs\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.698986 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.699032 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.699094 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.699247 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.702047 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5bbcb89d85-28rqw"]
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.734197 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6848d4c5cd-8fv74"]
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.736247 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6848d4c5cd-8fv74"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.744353 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.772878 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6848d4c5cd-8fv74"]
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.801374 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c58926f-d674-4eae-a37f-ffa9642e693a-logs\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.801477 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.801506 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c58926f-d674-4eae-a37f-ffa9642e693a-scripts\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.801544 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.801590 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d44kx\" (UniqueName: \"kubernetes.io/projected/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-kube-api-access-d44kx\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.801623 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4c58926f-d674-4eae-a37f-ffa9642e693a-ceph\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.801665 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.801694 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c58926f-d674-4eae-a37f-ffa9642e693a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.801738 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-logs\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.801781 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.801823 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0"
Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.801854 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0" Nov 25
09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.801879 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.801921 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c58926f-d674-4eae-a37f-ffa9642e693a-config-data\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.801957 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frkjg\" (UniqueName: \"kubernetes.io/projected/4c58926f-d674-4eae-a37f-ffa9642e693a-kube-api-access-frkjg\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.802022 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c58926f-d674-4eae-a37f-ffa9642e693a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.802048 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c58926f-d674-4eae-a37f-ffa9642e693a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " 
pod="openstack/glance-default-external-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.802072 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.802566 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.802798 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-logs\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.825467 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.833397 4565 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: 
I1125 09:48:33.833673 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.840055 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.841895 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.881168 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.895750 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d44kx\" (UniqueName: \"kubernetes.io/projected/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-kube-api-access-d44kx\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.896645 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Nov 25 09:48:33 crc kubenswrapper[4565]: E1125 09:48:33.897853 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-internal-api-0" podUID="e855fc5f-cc7c-4beb-afde-b326f6bcc33b" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.904236 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c58926f-d674-4eae-a37f-ffa9642e693a-scripts\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.904284 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcgqb\" (UniqueName: \"kubernetes.io/projected/79e96477-2f1b-49e7-89f9-d6a18694af63-kube-api-access-wcgqb\") pod \"horizon-6848d4c5cd-8fv74\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.904336 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/79e96477-2f1b-49e7-89f9-d6a18694af63-horizon-secret-key\") pod \"horizon-6848d4c5cd-8fv74\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.904360 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4c58926f-d674-4eae-a37f-ffa9642e693a-ceph\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.904396 4565 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79e96477-2f1b-49e7-89f9-d6a18694af63-scripts\") pod \"horizon-6848d4c5cd-8fv74\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.904423 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c58926f-d674-4eae-a37f-ffa9642e693a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.904443 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/79e96477-2f1b-49e7-89f9-d6a18694af63-horizon-tls-certs\") pod \"horizon-6848d4c5cd-8fv74\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.904531 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.904580 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c58926f-d674-4eae-a37f-ffa9642e693a-config-data\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.904603 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-frkjg\" (UniqueName: \"kubernetes.io/projected/4c58926f-d674-4eae-a37f-ffa9642e693a-kube-api-access-frkjg\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.904624 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e96477-2f1b-49e7-89f9-d6a18694af63-combined-ca-bundle\") pod \"horizon-6848d4c5cd-8fv74\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.904677 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c58926f-d674-4eae-a37f-ffa9642e693a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.904701 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c58926f-d674-4eae-a37f-ffa9642e693a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.904745 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79e96477-2f1b-49e7-89f9-d6a18694af63-logs\") pod \"horizon-6848d4c5cd-8fv74\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.904779 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/79e96477-2f1b-49e7-89f9-d6a18694af63-config-data\") pod \"horizon-6848d4c5cd-8fv74\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.904805 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c58926f-d674-4eae-a37f-ffa9642e693a-logs\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.905296 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c58926f-d674-4eae-a37f-ffa9642e693a-logs\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.905624 4565 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.913193 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c58926f-d674-4eae-a37f-ffa9642e693a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.973502 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c58926f-d674-4eae-a37f-ffa9642e693a-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.974297 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4c58926f-d674-4eae-a37f-ffa9642e693a-ceph\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.975127 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c58926f-d674-4eae-a37f-ffa9642e693a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.983815 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c58926f-d674-4eae-a37f-ffa9642e693a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:33 crc kubenswrapper[4565]: I1125 09:48:33.987386 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.023812 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e96477-2f1b-49e7-89f9-d6a18694af63-combined-ca-bundle\") pod \"horizon-6848d4c5cd-8fv74\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " 
pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.024043 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79e96477-2f1b-49e7-89f9-d6a18694af63-logs\") pod \"horizon-6848d4c5cd-8fv74\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.024102 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79e96477-2f1b-49e7-89f9-d6a18694af63-config-data\") pod \"horizon-6848d4c5cd-8fv74\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.024242 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcgqb\" (UniqueName: \"kubernetes.io/projected/79e96477-2f1b-49e7-89f9-d6a18694af63-kube-api-access-wcgqb\") pod \"horizon-6848d4c5cd-8fv74\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.024331 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/79e96477-2f1b-49e7-89f9-d6a18694af63-horizon-secret-key\") pod \"horizon-6848d4c5cd-8fv74\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.024380 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79e96477-2f1b-49e7-89f9-d6a18694af63-scripts\") pod \"horizon-6848d4c5cd-8fv74\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.024421 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/79e96477-2f1b-49e7-89f9-d6a18694af63-horizon-tls-certs\") pod \"horizon-6848d4c5cd-8fv74\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.035352 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c58926f-d674-4eae-a37f-ffa9642e693a-config-data\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.036396 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79e96477-2f1b-49e7-89f9-d6a18694af63-logs\") pod \"horizon-6848d4c5cd-8fv74\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.036812 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67bbcd57bf-nszbm"] Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.037447 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79e96477-2f1b-49e7-89f9-d6a18694af63-config-data\") pod \"horizon-6848d4c5cd-8fv74\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.048434 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79e96477-2f1b-49e7-89f9-d6a18694af63-scripts\") pod \"horizon-6848d4c5cd-8fv74\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.059412 4565 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcgqb\" (UniqueName: \"kubernetes.io/projected/79e96477-2f1b-49e7-89f9-d6a18694af63-kube-api-access-wcgqb\") pod \"horizon-6848d4c5cd-8fv74\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.071411 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/79e96477-2f1b-49e7-89f9-d6a18694af63-horizon-secret-key\") pod \"horizon-6848d4c5cd-8fv74\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.073629 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frkjg\" (UniqueName: \"kubernetes.io/projected/4c58926f-d674-4eae-a37f-ffa9642e693a-kube-api-access-frkjg\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.081644 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/79e96477-2f1b-49e7-89f9-d6a18694af63-horizon-tls-certs\") pod \"horizon-6848d4c5cd-8fv74\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.088508 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6549bb6ccb-qd7ll"] Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.100785 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e96477-2f1b-49e7-89f9-d6a18694af63-combined-ca-bundle\") pod \"horizon-6848d4c5cd-8fv74\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " 
pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.100944 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.126016 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.151078 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.152138 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.181788 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6549bb6ccb-qd7ll"] Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.230586 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb-scripts\") pod \"horizon-6549bb6ccb-qd7ll\" (UID: \"d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb\") " pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.230673 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb-config-data\") pod \"horizon-6549bb6ccb-qd7ll\" (UID: \"d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb\") " pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.230777 4565 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb-horizon-secret-key\") pod \"horizon-6549bb6ccb-qd7ll\" (UID: \"d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb\") " pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.230841 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb-horizon-tls-certs\") pod \"horizon-6549bb6ccb-qd7ll\" (UID: \"d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb\") " pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.230870 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb-combined-ca-bundle\") pod \"horizon-6549bb6ccb-qd7ll\" (UID: \"d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb\") " pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.230896 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb-logs\") pod \"horizon-6549bb6ccb-qd7ll\" (UID: \"d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb\") " pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.230967 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqlr5\" (UniqueName: \"kubernetes.io/projected/d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb-kube-api-access-xqlr5\") pod \"horizon-6549bb6ccb-qd7ll\" (UID: \"d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb\") " pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.332720 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb-horizon-tls-certs\") pod \"horizon-6549bb6ccb-qd7ll\" (UID: \"d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb\") " pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.332784 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb-combined-ca-bundle\") pod \"horizon-6549bb6ccb-qd7ll\" (UID: \"d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb\") " pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.332818 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb-logs\") pod \"horizon-6549bb6ccb-qd7ll\" (UID: \"d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb\") " pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.332899 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqlr5\" (UniqueName: \"kubernetes.io/projected/d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb-kube-api-access-xqlr5\") pod \"horizon-6549bb6ccb-qd7ll\" (UID: \"d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb\") " pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.332976 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb-scripts\") pod \"horizon-6549bb6ccb-qd7ll\" (UID: \"d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb\") " pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.333037 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb-config-data\") pod \"horizon-6549bb6ccb-qd7ll\" (UID: \"d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb\") " pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.333544 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb-logs\") pod \"horizon-6549bb6ccb-qd7ll\" (UID: \"d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb\") " pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.334032 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb-scripts\") pod \"horizon-6549bb6ccb-qd7ll\" (UID: \"d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb\") " pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.334325 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb-horizon-secret-key\") pod \"horizon-6549bb6ccb-qd7ll\" (UID: \"d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb\") " pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.335556 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb-config-data\") pod \"horizon-6549bb6ccb-qd7ll\" (UID: \"d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb\") " pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.337544 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb-horizon-tls-certs\") pod \"horizon-6549bb6ccb-qd7ll\" (UID: \"d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb\") 
" pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.339861 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb-horizon-secret-key\") pod \"horizon-6549bb6ccb-qd7ll\" (UID: \"d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb\") " pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.342455 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb-combined-ca-bundle\") pod \"horizon-6549bb6ccb-qd7ll\" (UID: \"d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb\") " pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.360485 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.378562 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqlr5\" (UniqueName: \"kubernetes.io/projected/d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb-kube-api-access-xqlr5\") pod \"horizon-6549bb6ccb-qd7ll\" (UID: \"d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb\") " pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.409961 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"d40080b6-5cb6-48e6-9625-9c8b821ed10b","Type":"ContainerStarted","Data":"adcf5db5502a596cbc7c413f0b26b1a5b838c9b50fd928487ae0cc573eca6a9f"} Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.422586 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"d3b9c251-0777-4463-916e-b6712e7a69b7","Type":"ContainerStarted","Data":"688fb01b33c8c495f7ea86a150887db36a64f210ff17d92639f90a97899f205a"} Nov 25 09:48:34 crc 
kubenswrapper[4565]: I1125 09:48:34.422695 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.442151 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.448103 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.502024 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.089822429 podStartE2EDuration="4.502000904s" podCreationTimestamp="2025-11-25 09:48:30 +0000 UTC" firstStartedPulling="2025-11-25 09:48:31.638685055 +0000 UTC m=+2644.841180184" lastFinishedPulling="2025-11-25 09:48:33.050863521 +0000 UTC m=+2646.253358659" observedRunningTime="2025-11-25 09:48:34.463985455 +0000 UTC m=+2647.666480584" watchObservedRunningTime="2025-11-25 09:48:34.502000904 +0000 UTC m=+2647.704496041" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.539731 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.174448658 podStartE2EDuration="4.539708993s" podCreationTimestamp="2025-11-25 09:48:30 +0000 UTC" firstStartedPulling="2025-11-25 09:48:31.715107981 +0000 UTC m=+2644.917603118" lastFinishedPulling="2025-11-25 09:48:33.080368315 +0000 UTC m=+2646.282863453" observedRunningTime="2025-11-25 09:48:34.52735925 +0000 UTC m=+2647.729854388" watchObservedRunningTime="2025-11-25 09:48:34.539708993 +0000 UTC m=+2647.742204130" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.567161 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d44kx\" (UniqueName: 
\"kubernetes.io/projected/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-kube-api-access-d44kx\") pod \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.567624 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-logs\") pod \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.567717 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.567792 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-internal-tls-certs\") pod \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.567894 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-combined-ca-bundle\") pod \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.567980 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-scripts\") pod \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.568025 4565 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-httpd-run\") pod \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.568080 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-config-data\") pod \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.568119 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-ceph\") pod \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\" (UID: \"e855fc5f-cc7c-4beb-afde-b326f6bcc33b\") " Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.585726 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "e855fc5f-cc7c-4beb-afde-b326f6bcc33b" (UID: "e855fc5f-cc7c-4beb-afde-b326f6bcc33b"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.592320 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-logs" (OuterVolumeSpecName: "logs") pod "e855fc5f-cc7c-4beb-afde-b326f6bcc33b" (UID: "e855fc5f-cc7c-4beb-afde-b326f6bcc33b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.603497 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e855fc5f-cc7c-4beb-afde-b326f6bcc33b" (UID: "e855fc5f-cc7c-4beb-afde-b326f6bcc33b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.603615 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-scripts" (OuterVolumeSpecName: "scripts") pod "e855fc5f-cc7c-4beb-afde-b326f6bcc33b" (UID: "e855fc5f-cc7c-4beb-afde-b326f6bcc33b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.603688 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e855fc5f-cc7c-4beb-afde-b326f6bcc33b" (UID: "e855fc5f-cc7c-4beb-afde-b326f6bcc33b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.603763 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-ceph" (OuterVolumeSpecName: "ceph") pod "e855fc5f-cc7c-4beb-afde-b326f6bcc33b" (UID: "e855fc5f-cc7c-4beb-afde-b326f6bcc33b"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.608626 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e855fc5f-cc7c-4beb-afde-b326f6bcc33b" (UID: "e855fc5f-cc7c-4beb-afde-b326f6bcc33b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.608871 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-kube-api-access-d44kx" (OuterVolumeSpecName: "kube-api-access-d44kx") pod "e855fc5f-cc7c-4beb-afde-b326f6bcc33b" (UID: "e855fc5f-cc7c-4beb-afde-b326f6bcc33b"). InnerVolumeSpecName "kube-api-access-d44kx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.610072 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-config-data" (OuterVolumeSpecName: "config-data") pod "e855fc5f-cc7c-4beb-afde-b326f6bcc33b" (UID: "e855fc5f-cc7c-4beb-afde-b326f6bcc33b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.680742 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d44kx\" (UniqueName: \"kubernetes.io/projected/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-kube-api-access-d44kx\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.680772 4565 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-logs\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.680948 4565 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.680971 4565 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.680981 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.680991 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.680999 4565 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.681007 4565 reconciler_common.go:293] "Volume 
detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.681016 4565 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e855fc5f-cc7c-4beb-afde-b326f6bcc33b-ceph\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.721896 4565 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.783488 4565 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.970185 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-lcpln" Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.991533 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b8rn\" (UniqueName: \"kubernetes.io/projected/6bc64fad-0ded-4d44-a53b-cdd2f5cebc45-kube-api-access-6b8rn\") pod \"6bc64fad-0ded-4d44-a53b-cdd2f5cebc45\" (UID: \"6bc64fad-0ded-4d44-a53b-cdd2f5cebc45\") " Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.991580 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bc64fad-0ded-4d44-a53b-cdd2f5cebc45-operator-scripts\") pod \"6bc64fad-0ded-4d44-a53b-cdd2f5cebc45\" (UID: \"6bc64fad-0ded-4d44-a53b-cdd2f5cebc45\") " Nov 25 09:48:34 crc kubenswrapper[4565]: I1125 09:48:34.992508 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/6bc64fad-0ded-4d44-a53b-cdd2f5cebc45-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6bc64fad-0ded-4d44-a53b-cdd2f5cebc45" (UID: "6bc64fad-0ded-4d44-a53b-cdd2f5cebc45"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.006903 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bc64fad-0ded-4d44-a53b-cdd2f5cebc45-kube-api-access-6b8rn" (OuterVolumeSpecName: "kube-api-access-6b8rn") pod "6bc64fad-0ded-4d44-a53b-cdd2f5cebc45" (UID: "6bc64fad-0ded-4d44-a53b-cdd2f5cebc45"). InnerVolumeSpecName "kube-api-access-6b8rn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.044331 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-4214-account-create-ftd48" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.093869 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/427c19d8-2ea1-4831-8591-8b8d52eb83dd-operator-scripts\") pod \"427c19d8-2ea1-4831-8591-8b8d52eb83dd\" (UID: \"427c19d8-2ea1-4831-8591-8b8d52eb83dd\") " Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.094104 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmdwb\" (UniqueName: \"kubernetes.io/projected/427c19d8-2ea1-4831-8591-8b8d52eb83dd-kube-api-access-cmdwb\") pod \"427c19d8-2ea1-4831-8591-8b8d52eb83dd\" (UID: \"427c19d8-2ea1-4831-8591-8b8d52eb83dd\") " Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.094773 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b8rn\" (UniqueName: \"kubernetes.io/projected/6bc64fad-0ded-4d44-a53b-cdd2f5cebc45-kube-api-access-6b8rn\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:35 crc 
kubenswrapper[4565]: I1125 09:48:35.094797 4565 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bc64fad-0ded-4d44-a53b-cdd2f5cebc45-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.095783 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/427c19d8-2ea1-4831-8591-8b8d52eb83dd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "427c19d8-2ea1-4831-8591-8b8d52eb83dd" (UID: "427c19d8-2ea1-4831-8591-8b8d52eb83dd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.108084 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/427c19d8-2ea1-4831-8591-8b8d52eb83dd-kube-api-access-cmdwb" (OuterVolumeSpecName: "kube-api-access-cmdwb") pod "427c19d8-2ea1-4831-8591-8b8d52eb83dd" (UID: "427c19d8-2ea1-4831-8591-8b8d52eb83dd"). InnerVolumeSpecName "kube-api-access-cmdwb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.116068 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2237351d-d594-4ff6-beb7-dcb02aae0366" path="/var/lib/kubelet/pods/2237351d-d594-4ff6-beb7-dcb02aae0366/volumes" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.116553 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde" path="/var/lib/kubelet/pods/8294d1b8-3c07-4e2f-a6d1-5a251f6a4dde/volumes" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.199685 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmdwb\" (UniqueName: \"kubernetes.io/projected/427c19d8-2ea1-4831-8591-8b8d52eb83dd-kube-api-access-cmdwb\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.199730 4565 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/427c19d8-2ea1-4831-8591-8b8d52eb83dd-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.253232 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.358635 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6549bb6ccb-qd7ll"] Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.372787 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6848d4c5cd-8fv74"] Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.433508 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6848d4c5cd-8fv74" event={"ID":"79e96477-2f1b-49e7-89f9-d6a18694af63","Type":"ContainerStarted","Data":"37c3dc2a69d6edbc8600eb25c60b3f8b803a590a67a38f3dcf74c28b21357123"} Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.436476 4565 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-lcpln" event={"ID":"6bc64fad-0ded-4d44-a53b-cdd2f5cebc45","Type":"ContainerDied","Data":"7f53899380e904b7e81443e5a5093307c6e4b140af8c3ca900dad756ea94d700"} Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.436505 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f53899380e904b7e81443e5a5093307c6e4b140af8c3ca900dad756ea94d700" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.436563 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-lcpln" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.438289 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c58926f-d674-4eae-a37f-ffa9642e693a","Type":"ContainerStarted","Data":"5dc62f209a1df6c1de927d7cfad92b367845c0051710d761b902112d26e973e4"} Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.444101 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6549bb6ccb-qd7ll" event={"ID":"d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb","Type":"ContainerStarted","Data":"3a36861ec25c28da85fc3321540aeb0ce7bcc6193d56b164a6a3e38ce3ddb5c6"} Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.446118 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.446607 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-4214-account-create-ftd48" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.446991 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-4214-account-create-ftd48" event={"ID":"427c19d8-2ea1-4831-8591-8b8d52eb83dd","Type":"ContainerDied","Data":"2edb58051f1aca678a1c555d7306208b575dd8c28e498fccc5dff2851d2e5c93"} Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.447013 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2edb58051f1aca678a1c555d7306208b575dd8c28e498fccc5dff2851d2e5c93" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.523009 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.533993 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.539827 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 09:48:35 crc kubenswrapper[4565]: E1125 09:48:35.540300 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="427c19d8-2ea1-4831-8591-8b8d52eb83dd" containerName="mariadb-account-create" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.540314 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="427c19d8-2ea1-4831-8591-8b8d52eb83dd" containerName="mariadb-account-create" Nov 25 09:48:35 crc kubenswrapper[4565]: E1125 09:48:35.540341 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc64fad-0ded-4d44-a53b-cdd2f5cebc45" containerName="mariadb-database-create" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.540347 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc64fad-0ded-4d44-a53b-cdd2f5cebc45" containerName="mariadb-database-create" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.540540 4565 
memory_manager.go:354] "RemoveStaleState removing state" podUID="427c19d8-2ea1-4831-8591-8b8d52eb83dd" containerName="mariadb-account-create" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.540565 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc64fad-0ded-4d44-a53b-cdd2f5cebc45" containerName="mariadb-database-create" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.541599 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.547744 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.549245 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.555137 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.590294 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.602061 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.710898 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/451d4f5d-1ecc-4633-a889-ea95473bc981-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.711006 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/451d4f5d-1ecc-4633-a889-ea95473bc981-config-data\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.711041 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/451d4f5d-1ecc-4633-a889-ea95473bc981-scripts\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.711091 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phmmd\" (UniqueName: \"kubernetes.io/projected/451d4f5d-1ecc-4633-a889-ea95473bc981-kube-api-access-phmmd\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.711117 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/451d4f5d-1ecc-4633-a889-ea95473bc981-ceph\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.711201 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.711218 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451d4f5d-1ecc-4633-a889-ea95473bc981-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.711245 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/451d4f5d-1ecc-4633-a889-ea95473bc981-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.711264 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/451d4f5d-1ecc-4633-a889-ea95473bc981-logs\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.814589 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/451d4f5d-1ecc-4633-a889-ea95473bc981-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.814661 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451d4f5d-1ecc-4633-a889-ea95473bc981-config-data\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.814700 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/451d4f5d-1ecc-4633-a889-ea95473bc981-scripts\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.814745 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phmmd\" (UniqueName: \"kubernetes.io/projected/451d4f5d-1ecc-4633-a889-ea95473bc981-kube-api-access-phmmd\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.814772 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/451d4f5d-1ecc-4633-a889-ea95473bc981-ceph\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.814847 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.814866 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451d4f5d-1ecc-4633-a889-ea95473bc981-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.814895 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/451d4f5d-1ecc-4633-a889-ea95473bc981-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.814917 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/451d4f5d-1ecc-4633-a889-ea95473bc981-logs\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.815469 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/451d4f5d-1ecc-4633-a889-ea95473bc981-logs\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.815728 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/451d4f5d-1ecc-4633-a889-ea95473bc981-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.817378 4565 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.837738 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451d4f5d-1ecc-4633-a889-ea95473bc981-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.838620 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/451d4f5d-1ecc-4633-a889-ea95473bc981-scripts\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.838673 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/451d4f5d-1ecc-4633-a889-ea95473bc981-ceph\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.839948 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451d4f5d-1ecc-4633-a889-ea95473bc981-config-data\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.841573 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phmmd\" (UniqueName: \"kubernetes.io/projected/451d4f5d-1ecc-4633-a889-ea95473bc981-kube-api-access-phmmd\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.870234 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/451d4f5d-1ecc-4633-a889-ea95473bc981-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " 
pod="openstack/glance-default-internal-api-0" Nov 25 09:48:35 crc kubenswrapper[4565]: I1125 09:48:35.885075 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"451d4f5d-1ecc-4633-a889-ea95473bc981\") " pod="openstack/glance-default-internal-api-0" Nov 25 09:48:36 crc kubenswrapper[4565]: I1125 09:48:36.162975 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 09:48:36 crc kubenswrapper[4565]: I1125 09:48:36.483353 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-r5rpm"] Nov 25 09:48:36 crc kubenswrapper[4565]: I1125 09:48:36.485033 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-r5rpm" Nov 25 09:48:36 crc kubenswrapper[4565]: I1125 09:48:36.487811 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-4s7rx" Nov 25 09:48:36 crc kubenswrapper[4565]: I1125 09:48:36.487853 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Nov 25 09:48:36 crc kubenswrapper[4565]: I1125 09:48:36.496814 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-r5rpm"] Nov 25 09:48:36 crc kubenswrapper[4565]: I1125 09:48:36.544270 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c58926f-d674-4eae-a37f-ffa9642e693a","Type":"ContainerStarted","Data":"46e60863c7b33de185a35c048a8239a27f32c362941a4aa5de733280a6c3bcab"} Nov 25 09:48:36 crc kubenswrapper[4565]: I1125 09:48:36.567900 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b64f1fae-2be7-4ebd-a561-58884c10b4e6-combined-ca-bundle\") pod \"manila-db-sync-r5rpm\" (UID: \"b64f1fae-2be7-4ebd-a561-58884c10b4e6\") " pod="openstack/manila-db-sync-r5rpm" Nov 25 09:48:36 crc kubenswrapper[4565]: I1125 09:48:36.568005 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b64f1fae-2be7-4ebd-a561-58884c10b4e6-config-data\") pod \"manila-db-sync-r5rpm\" (UID: \"b64f1fae-2be7-4ebd-a561-58884c10b4e6\") " pod="openstack/manila-db-sync-r5rpm" Nov 25 09:48:36 crc kubenswrapper[4565]: I1125 09:48:36.568213 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k68d\" (UniqueName: \"kubernetes.io/projected/b64f1fae-2be7-4ebd-a561-58884c10b4e6-kube-api-access-9k68d\") pod \"manila-db-sync-r5rpm\" (UID: \"b64f1fae-2be7-4ebd-a561-58884c10b4e6\") " pod="openstack/manila-db-sync-r5rpm" Nov 25 09:48:36 crc kubenswrapper[4565]: I1125 09:48:36.568465 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b64f1fae-2be7-4ebd-a561-58884c10b4e6-job-config-data\") pod \"manila-db-sync-r5rpm\" (UID: \"b64f1fae-2be7-4ebd-a561-58884c10b4e6\") " pod="openstack/manila-db-sync-r5rpm" Nov 25 09:48:36 crc kubenswrapper[4565]: I1125 09:48:36.672511 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k68d\" (UniqueName: \"kubernetes.io/projected/b64f1fae-2be7-4ebd-a561-58884c10b4e6-kube-api-access-9k68d\") pod \"manila-db-sync-r5rpm\" (UID: \"b64f1fae-2be7-4ebd-a561-58884c10b4e6\") " pod="openstack/manila-db-sync-r5rpm" Nov 25 09:48:36 crc kubenswrapper[4565]: I1125 09:48:36.672664 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: 
\"kubernetes.io/secret/b64f1fae-2be7-4ebd-a561-58884c10b4e6-job-config-data\") pod \"manila-db-sync-r5rpm\" (UID: \"b64f1fae-2be7-4ebd-a561-58884c10b4e6\") " pod="openstack/manila-db-sync-r5rpm" Nov 25 09:48:36 crc kubenswrapper[4565]: I1125 09:48:36.672723 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b64f1fae-2be7-4ebd-a561-58884c10b4e6-combined-ca-bundle\") pod \"manila-db-sync-r5rpm\" (UID: \"b64f1fae-2be7-4ebd-a561-58884c10b4e6\") " pod="openstack/manila-db-sync-r5rpm" Nov 25 09:48:36 crc kubenswrapper[4565]: I1125 09:48:36.672793 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b64f1fae-2be7-4ebd-a561-58884c10b4e6-config-data\") pod \"manila-db-sync-r5rpm\" (UID: \"b64f1fae-2be7-4ebd-a561-58884c10b4e6\") " pod="openstack/manila-db-sync-r5rpm" Nov 25 09:48:36 crc kubenswrapper[4565]: I1125 09:48:36.688983 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b64f1fae-2be7-4ebd-a561-58884c10b4e6-job-config-data\") pod \"manila-db-sync-r5rpm\" (UID: \"b64f1fae-2be7-4ebd-a561-58884c10b4e6\") " pod="openstack/manila-db-sync-r5rpm" Nov 25 09:48:36 crc kubenswrapper[4565]: I1125 09:48:36.689452 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b64f1fae-2be7-4ebd-a561-58884c10b4e6-combined-ca-bundle\") pod \"manila-db-sync-r5rpm\" (UID: \"b64f1fae-2be7-4ebd-a561-58884c10b4e6\") " pod="openstack/manila-db-sync-r5rpm" Nov 25 09:48:36 crc kubenswrapper[4565]: I1125 09:48:36.689667 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b64f1fae-2be7-4ebd-a561-58884c10b4e6-config-data\") pod \"manila-db-sync-r5rpm\" (UID: \"b64f1fae-2be7-4ebd-a561-58884c10b4e6\") " 
pod="openstack/manila-db-sync-r5rpm" Nov 25 09:48:36 crc kubenswrapper[4565]: I1125 09:48:36.703336 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k68d\" (UniqueName: \"kubernetes.io/projected/b64f1fae-2be7-4ebd-a561-58884c10b4e6-kube-api-access-9k68d\") pod \"manila-db-sync-r5rpm\" (UID: \"b64f1fae-2be7-4ebd-a561-58884c10b4e6\") " pod="openstack/manila-db-sync-r5rpm" Nov 25 09:48:36 crc kubenswrapper[4565]: I1125 09:48:36.850919 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-r5rpm" Nov 25 09:48:36 crc kubenswrapper[4565]: I1125 09:48:36.975123 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 09:48:37 crc kubenswrapper[4565]: I1125 09:48:37.113903 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e855fc5f-cc7c-4beb-afde-b326f6bcc33b" path="/var/lib/kubelet/pods/e855fc5f-cc7c-4beb-afde-b326f6bcc33b/volumes" Nov 25 09:48:37 crc kubenswrapper[4565]: I1125 09:48:37.482888 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-r5rpm"] Nov 25 09:48:37 crc kubenswrapper[4565]: I1125 09:48:37.592527 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-r5rpm" event={"ID":"b64f1fae-2be7-4ebd-a561-58884c10b4e6","Type":"ContainerStarted","Data":"87408143fc2381057d58ab1321cb8cf94927e2b9d16fb7de83de4e7fd3dd25c1"} Nov 25 09:48:37 crc kubenswrapper[4565]: I1125 09:48:37.608764 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"451d4f5d-1ecc-4633-a889-ea95473bc981","Type":"ContainerStarted","Data":"92db744e57cec3a7e358a7dc029540eaee9a1f6e93865b5b7dfd0c843c43a32d"} Nov 25 09:48:37 crc kubenswrapper[4565]: I1125 09:48:37.622145 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"4c58926f-d674-4eae-a37f-ffa9642e693a","Type":"ContainerStarted","Data":"7dafaccc8a0704357c31ce3edb8c4bf91f688a189b6fa82a085764c22700c5f4"} Nov 25 09:48:37 crc kubenswrapper[4565]: I1125 09:48:37.622327 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4c58926f-d674-4eae-a37f-ffa9642e693a" containerName="glance-log" containerID="cri-o://46e60863c7b33de185a35c048a8239a27f32c362941a4aa5de733280a6c3bcab" gracePeriod=30 Nov 25 09:48:37 crc kubenswrapper[4565]: I1125 09:48:37.622881 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4c58926f-d674-4eae-a37f-ffa9642e693a" containerName="glance-httpd" containerID="cri-o://7dafaccc8a0704357c31ce3edb8c4bf91f688a189b6fa82a085764c22700c5f4" gracePeriod=30 Nov 25 09:48:37 crc kubenswrapper[4565]: I1125 09:48:37.668861 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.668840317 podStartE2EDuration="4.668840317s" podCreationTimestamp="2025-11-25 09:48:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:48:37.65475368 +0000 UTC m=+2650.857248818" watchObservedRunningTime="2025-11-25 09:48:37.668840317 +0000 UTC m=+2650.871335454" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.293522 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.332319 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c58926f-d674-4eae-a37f-ffa9642e693a-config-data\") pod \"4c58926f-d674-4eae-a37f-ffa9642e693a\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.332393 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c58926f-d674-4eae-a37f-ffa9642e693a-logs\") pod \"4c58926f-d674-4eae-a37f-ffa9642e693a\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.332429 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"4c58926f-d674-4eae-a37f-ffa9642e693a\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.332495 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c58926f-d674-4eae-a37f-ffa9642e693a-public-tls-certs\") pod \"4c58926f-d674-4eae-a37f-ffa9642e693a\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.332519 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c58926f-d674-4eae-a37f-ffa9642e693a-httpd-run\") pod \"4c58926f-d674-4eae-a37f-ffa9642e693a\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.332672 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4c58926f-d674-4eae-a37f-ffa9642e693a-scripts\") pod \"4c58926f-d674-4eae-a37f-ffa9642e693a\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.332770 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c58926f-d674-4eae-a37f-ffa9642e693a-combined-ca-bundle\") pod \"4c58926f-d674-4eae-a37f-ffa9642e693a\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.332823 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frkjg\" (UniqueName: \"kubernetes.io/projected/4c58926f-d674-4eae-a37f-ffa9642e693a-kube-api-access-frkjg\") pod \"4c58926f-d674-4eae-a37f-ffa9642e693a\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.332961 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4c58926f-d674-4eae-a37f-ffa9642e693a-ceph\") pod \"4c58926f-d674-4eae-a37f-ffa9642e693a\" (UID: \"4c58926f-d674-4eae-a37f-ffa9642e693a\") " Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.334176 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c58926f-d674-4eae-a37f-ffa9642e693a-logs" (OuterVolumeSpecName: "logs") pod "4c58926f-d674-4eae-a37f-ffa9642e693a" (UID: "4c58926f-d674-4eae-a37f-ffa9642e693a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.334685 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c58926f-d674-4eae-a37f-ffa9642e693a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4c58926f-d674-4eae-a37f-ffa9642e693a" (UID: "4c58926f-d674-4eae-a37f-ffa9642e693a"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.335561 4565 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c58926f-d674-4eae-a37f-ffa9642e693a-logs\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.335584 4565 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c58926f-d674-4eae-a37f-ffa9642e693a-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.344233 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c58926f-d674-4eae-a37f-ffa9642e693a-scripts" (OuterVolumeSpecName: "scripts") pod "4c58926f-d674-4eae-a37f-ffa9642e693a" (UID: "4c58926f-d674-4eae-a37f-ffa9642e693a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.346152 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c58926f-d674-4eae-a37f-ffa9642e693a-kube-api-access-frkjg" (OuterVolumeSpecName: "kube-api-access-frkjg") pod "4c58926f-d674-4eae-a37f-ffa9642e693a" (UID: "4c58926f-d674-4eae-a37f-ffa9642e693a"). InnerVolumeSpecName "kube-api-access-frkjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.355955 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "4c58926f-d674-4eae-a37f-ffa9642e693a" (UID: "4c58926f-d674-4eae-a37f-ffa9642e693a"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.359876 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c58926f-d674-4eae-a37f-ffa9642e693a-ceph" (OuterVolumeSpecName: "ceph") pod "4c58926f-d674-4eae-a37f-ffa9642e693a" (UID: "4c58926f-d674-4eae-a37f-ffa9642e693a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.398264 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c58926f-d674-4eae-a37f-ffa9642e693a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c58926f-d674-4eae-a37f-ffa9642e693a" (UID: "4c58926f-d674-4eae-a37f-ffa9642e693a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.441491 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c58926f-d674-4eae-a37f-ffa9642e693a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.441583 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frkjg\" (UniqueName: \"kubernetes.io/projected/4c58926f-d674-4eae-a37f-ffa9642e693a-kube-api-access-frkjg\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.441645 4565 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4c58926f-d674-4eae-a37f-ffa9642e693a-ceph\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.441737 4565 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Nov 25 09:48:38 crc 
kubenswrapper[4565]: I1125 09:48:38.441802 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c58926f-d674-4eae-a37f-ffa9642e693a-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.448288 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c58926f-d674-4eae-a37f-ffa9642e693a-config-data" (OuterVolumeSpecName: "config-data") pod "4c58926f-d674-4eae-a37f-ffa9642e693a" (UID: "4c58926f-d674-4eae-a37f-ffa9642e693a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.457191 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c58926f-d674-4eae-a37f-ffa9642e693a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4c58926f-d674-4eae-a37f-ffa9642e693a" (UID: "4c58926f-d674-4eae-a37f-ffa9642e693a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.480074 4565 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.544778 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c58926f-d674-4eae-a37f-ffa9642e693a-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.545213 4565 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.545280 4565 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c58926f-d674-4eae-a37f-ffa9642e693a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.643548 4565 generic.go:334] "Generic (PLEG): container finished" podID="4c58926f-d674-4eae-a37f-ffa9642e693a" containerID="7dafaccc8a0704357c31ce3edb8c4bf91f688a189b6fa82a085764c22700c5f4" exitCode=143 Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.643589 4565 generic.go:334] "Generic (PLEG): container finished" podID="4c58926f-d674-4eae-a37f-ffa9642e693a" containerID="46e60863c7b33de185a35c048a8239a27f32c362941a4aa5de733280a6c3bcab" exitCode=143 Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.643663 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c58926f-d674-4eae-a37f-ffa9642e693a","Type":"ContainerDied","Data":"7dafaccc8a0704357c31ce3edb8c4bf91f688a189b6fa82a085764c22700c5f4"} Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.643700 4565 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c58926f-d674-4eae-a37f-ffa9642e693a","Type":"ContainerDied","Data":"46e60863c7b33de185a35c048a8239a27f32c362941a4aa5de733280a6c3bcab"} Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.643712 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c58926f-d674-4eae-a37f-ffa9642e693a","Type":"ContainerDied","Data":"5dc62f209a1df6c1de927d7cfad92b367845c0051710d761b902112d26e973e4"} Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.643708 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.643732 4565 scope.go:117] "RemoveContainer" containerID="7dafaccc8a0704357c31ce3edb8c4bf91f688a189b6fa82a085764c22700c5f4" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.653175 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"451d4f5d-1ecc-4633-a889-ea95473bc981","Type":"ContainerStarted","Data":"f951b394163960b828aff2547eadc6f5d5cbe429b77654e5d3914963152b1f2a"} Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.688554 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.719721 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.761431 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 09:48:38 crc kubenswrapper[4565]: E1125 09:48:38.761877 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c58926f-d674-4eae-a37f-ffa9642e693a" containerName="glance-httpd" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.761893 4565 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="4c58926f-d674-4eae-a37f-ffa9642e693a" containerName="glance-httpd" Nov 25 09:48:38 crc kubenswrapper[4565]: E1125 09:48:38.761922 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c58926f-d674-4eae-a37f-ffa9642e693a" containerName="glance-log" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.761945 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c58926f-d674-4eae-a37f-ffa9642e693a" containerName="glance-log" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.762158 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c58926f-d674-4eae-a37f-ffa9642e693a" containerName="glance-httpd" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.762172 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c58926f-d674-4eae-a37f-ffa9642e693a" containerName="glance-log" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.763241 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.767863 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.768182 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.771572 4565 scope.go:117] "RemoveContainer" containerID="46e60863c7b33de185a35c048a8239a27f32c362941a4aa5de733280a6c3bcab" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.799334 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.824026 4565 scope.go:117] "RemoveContainer" containerID="7dafaccc8a0704357c31ce3edb8c4bf91f688a189b6fa82a085764c22700c5f4" Nov 25 09:48:38 crc kubenswrapper[4565]: E1125 09:48:38.827068 
4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dafaccc8a0704357c31ce3edb8c4bf91f688a189b6fa82a085764c22700c5f4\": container with ID starting with 7dafaccc8a0704357c31ce3edb8c4bf91f688a189b6fa82a085764c22700c5f4 not found: ID does not exist" containerID="7dafaccc8a0704357c31ce3edb8c4bf91f688a189b6fa82a085764c22700c5f4" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.827118 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dafaccc8a0704357c31ce3edb8c4bf91f688a189b6fa82a085764c22700c5f4"} err="failed to get container status \"7dafaccc8a0704357c31ce3edb8c4bf91f688a189b6fa82a085764c22700c5f4\": rpc error: code = NotFound desc = could not find container \"7dafaccc8a0704357c31ce3edb8c4bf91f688a189b6fa82a085764c22700c5f4\": container with ID starting with 7dafaccc8a0704357c31ce3edb8c4bf91f688a189b6fa82a085764c22700c5f4 not found: ID does not exist" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.827149 4565 scope.go:117] "RemoveContainer" containerID="46e60863c7b33de185a35c048a8239a27f32c362941a4aa5de733280a6c3bcab" Nov 25 09:48:38 crc kubenswrapper[4565]: E1125 09:48:38.827694 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46e60863c7b33de185a35c048a8239a27f32c362941a4aa5de733280a6c3bcab\": container with ID starting with 46e60863c7b33de185a35c048a8239a27f32c362941a4aa5de733280a6c3bcab not found: ID does not exist" containerID="46e60863c7b33de185a35c048a8239a27f32c362941a4aa5de733280a6c3bcab" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.827745 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46e60863c7b33de185a35c048a8239a27f32c362941a4aa5de733280a6c3bcab"} err="failed to get container status \"46e60863c7b33de185a35c048a8239a27f32c362941a4aa5de733280a6c3bcab\": rpc error: code = 
NotFound desc = could not find container \"46e60863c7b33de185a35c048a8239a27f32c362941a4aa5de733280a6c3bcab\": container with ID starting with 46e60863c7b33de185a35c048a8239a27f32c362941a4aa5de733280a6c3bcab not found: ID does not exist" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.827775 4565 scope.go:117] "RemoveContainer" containerID="7dafaccc8a0704357c31ce3edb8c4bf91f688a189b6fa82a085764c22700c5f4" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.828132 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dafaccc8a0704357c31ce3edb8c4bf91f688a189b6fa82a085764c22700c5f4"} err="failed to get container status \"7dafaccc8a0704357c31ce3edb8c4bf91f688a189b6fa82a085764c22700c5f4\": rpc error: code = NotFound desc = could not find container \"7dafaccc8a0704357c31ce3edb8c4bf91f688a189b6fa82a085764c22700c5f4\": container with ID starting with 7dafaccc8a0704357c31ce3edb8c4bf91f688a189b6fa82a085764c22700c5f4 not found: ID does not exist" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.828171 4565 scope.go:117] "RemoveContainer" containerID="46e60863c7b33de185a35c048a8239a27f32c362941a4aa5de733280a6c3bcab" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.828724 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46e60863c7b33de185a35c048a8239a27f32c362941a4aa5de733280a6c3bcab"} err="failed to get container status \"46e60863c7b33de185a35c048a8239a27f32c362941a4aa5de733280a6c3bcab\": rpc error: code = NotFound desc = could not find container \"46e60863c7b33de185a35c048a8239a27f32c362941a4aa5de733280a6c3bcab\": container with ID starting with 46e60863c7b33de185a35c048a8239a27f32c362941a4aa5de733280a6c3bcab not found: ID does not exist" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.956442 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsxg9\" (UniqueName: 
\"kubernetes.io/projected/e1ee609b-48dd-4e92-9b13-e9bf94768ead-kube-api-access-zsxg9\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.956518 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1ee609b-48dd-4e92-9b13-e9bf94768ead-scripts\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.956552 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ee609b-48dd-4e92-9b13-e9bf94768ead-config-data\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.956858 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ee609b-48dd-4e92-9b13-e9bf94768ead-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.957175 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e1ee609b-48dd-4e92-9b13-e9bf94768ead-ceph\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.957319 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/e1ee609b-48dd-4e92-9b13-e9bf94768ead-logs\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.957518 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e1ee609b-48dd-4e92-9b13-e9bf94768ead-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.957564 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1ee609b-48dd-4e92-9b13-e9bf94768ead-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:38 crc kubenswrapper[4565]: I1125 09:48:38.957632 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:39 crc kubenswrapper[4565]: I1125 09:48:39.059810 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e1ee609b-48dd-4e92-9b13-e9bf94768ead-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:39 crc kubenswrapper[4565]: I1125 09:48:39.059859 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e1ee609b-48dd-4e92-9b13-e9bf94768ead-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:39 crc kubenswrapper[4565]: I1125 09:48:39.059902 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:39 crc kubenswrapper[4565]: I1125 09:48:39.059965 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsxg9\" (UniqueName: \"kubernetes.io/projected/e1ee609b-48dd-4e92-9b13-e9bf94768ead-kube-api-access-zsxg9\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:39 crc kubenswrapper[4565]: I1125 09:48:39.060011 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1ee609b-48dd-4e92-9b13-e9bf94768ead-scripts\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:39 crc kubenswrapper[4565]: I1125 09:48:39.060677 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e1ee609b-48dd-4e92-9b13-e9bf94768ead-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:39 crc kubenswrapper[4565]: I1125 09:48:39.060674 4565 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod 
\"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Nov 25 09:48:39 crc kubenswrapper[4565]: I1125 09:48:39.061112 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ee609b-48dd-4e92-9b13-e9bf94768ead-config-data\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:39 crc kubenswrapper[4565]: I1125 09:48:39.061237 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ee609b-48dd-4e92-9b13-e9bf94768ead-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:39 crc kubenswrapper[4565]: I1125 09:48:39.061398 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e1ee609b-48dd-4e92-9b13-e9bf94768ead-ceph\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:39 crc kubenswrapper[4565]: I1125 09:48:39.061456 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1ee609b-48dd-4e92-9b13-e9bf94768ead-logs\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:39 crc kubenswrapper[4565]: I1125 09:48:39.062849 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1ee609b-48dd-4e92-9b13-e9bf94768ead-logs\") pod \"glance-default-external-api-0\" (UID: 
\"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:39 crc kubenswrapper[4565]: I1125 09:48:39.070835 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1ee609b-48dd-4e92-9b13-e9bf94768ead-scripts\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:39 crc kubenswrapper[4565]: I1125 09:48:39.071076 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ee609b-48dd-4e92-9b13-e9bf94768ead-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:39 crc kubenswrapper[4565]: I1125 09:48:39.076178 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e1ee609b-48dd-4e92-9b13-e9bf94768ead-ceph\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:39 crc kubenswrapper[4565]: I1125 09:48:39.079795 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1ee609b-48dd-4e92-9b13-e9bf94768ead-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:39 crc kubenswrapper[4565]: I1125 09:48:39.082882 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsxg9\" (UniqueName: \"kubernetes.io/projected/e1ee609b-48dd-4e92-9b13-e9bf94768ead-kube-api-access-zsxg9\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 
09:48:39 crc kubenswrapper[4565]: I1125 09:48:39.090747 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ee609b-48dd-4e92-9b13-e9bf94768ead-config-data\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:39 crc kubenswrapper[4565]: I1125 09:48:39.110736 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c58926f-d674-4eae-a37f-ffa9642e693a" path="/var/lib/kubelet/pods/4c58926f-d674-4eae-a37f-ffa9642e693a/volumes" Nov 25 09:48:39 crc kubenswrapper[4565]: I1125 09:48:39.152262 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"e1ee609b-48dd-4e92-9b13-e9bf94768ead\") " pod="openstack/glance-default-external-api-0" Nov 25 09:48:39 crc kubenswrapper[4565]: I1125 09:48:39.390202 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 09:48:39 crc kubenswrapper[4565]: I1125 09:48:39.710313 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"451d4f5d-1ecc-4633-a889-ea95473bc981","Type":"ContainerStarted","Data":"f346368e0657901e8902d70ed8d2ae3730b122b2355f303bb36fb59c807abc22"} Nov 25 09:48:39 crc kubenswrapper[4565]: I1125 09:48:39.736852 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.736837909 podStartE2EDuration="4.736837909s" podCreationTimestamp="2025-11-25 09:48:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:48:39.72721607 +0000 UTC m=+2652.929711208" watchObservedRunningTime="2025-11-25 09:48:39.736837909 +0000 UTC m=+2652.939333046" Nov 25 09:48:40 crc kubenswrapper[4565]: I1125 09:48:40.172076 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 09:48:40 crc kubenswrapper[4565]: I1125 09:48:40.784065 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Nov 25 09:48:40 crc kubenswrapper[4565]: I1125 09:48:40.795876 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Nov 25 09:48:45 crc kubenswrapper[4565]: W1125 09:48:45.243843 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1ee609b_48dd_4e92_9b13_e9bf94768ead.slice/crio-08ccb331196aab4de9380616af8de7a234535b9fcf1793c5953d724afdb94365 WatchSource:0}: Error finding container 08ccb331196aab4de9380616af8de7a234535b9fcf1793c5953d724afdb94365: Status 404 returned error can't find the container with id 
08ccb331196aab4de9380616af8de7a234535b9fcf1793c5953d724afdb94365 Nov 25 09:48:45 crc kubenswrapper[4565]: I1125 09:48:45.805048 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e1ee609b-48dd-4e92-9b13-e9bf94768ead","Type":"ContainerStarted","Data":"08ccb331196aab4de9380616af8de7a234535b9fcf1793c5953d724afdb94365"} Nov 25 09:48:46 crc kubenswrapper[4565]: I1125 09:48:46.164725 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 25 09:48:46 crc kubenswrapper[4565]: I1125 09:48:46.164784 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 25 09:48:46 crc kubenswrapper[4565]: I1125 09:48:46.200757 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 25 09:48:46 crc kubenswrapper[4565]: I1125 09:48:46.205995 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 25 09:48:46 crc kubenswrapper[4565]: I1125 09:48:46.822242 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 25 09:48:46 crc kubenswrapper[4565]: I1125 09:48:46.822626 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 25 09:48:48 crc kubenswrapper[4565]: I1125 09:48:48.853376 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6848d4c5cd-8fv74" event={"ID":"79e96477-2f1b-49e7-89f9-d6a18694af63","Type":"ContainerStarted","Data":"9eb2815b0f68a95e51f96b7a202d3b8d46040bf18429f85713e40a2240b1f806"} Nov 25 09:48:48 crc kubenswrapper[4565]: I1125 09:48:48.857701 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bbcb89d85-28rqw" 
event={"ID":"bd8b0555-5ea6-4fae-b234-7f7661406dd1","Type":"ContainerStarted","Data":"e58fe857475f1c2ec8bdb344f6d08e289a16dbdf058d1306864fa7ed5d4c3a0b"} Nov 25 09:48:48 crc kubenswrapper[4565]: I1125 09:48:48.858639 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6549bb6ccb-qd7ll" event={"ID":"d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb","Type":"ContainerStarted","Data":"89ecb1116381b0134d588c3b01ab75d95e79239e752f5129fad2d5dd53e6040d"} Nov 25 09:48:48 crc kubenswrapper[4565]: I1125 09:48:48.878779 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67bbcd57bf-nszbm" event={"ID":"f6405915-6a04-4cdb-b837-6f12e31bb7bc","Type":"ContainerStarted","Data":"34c3ebad7ac7a403722ac7a691996ca6a7f45a867443ce361ce34bf562050ea1"} Nov 25 09:48:49 crc kubenswrapper[4565]: I1125 09:48:49.521747 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 25 09:48:49 crc kubenswrapper[4565]: I1125 09:48:49.522151 4565 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 09:48:49 crc kubenswrapper[4565]: I1125 09:48:49.587075 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 25 09:48:49 crc kubenswrapper[4565]: I1125 09:48:49.891551 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-r5rpm" event={"ID":"b64f1fae-2be7-4ebd-a561-58884c10b4e6","Type":"ContainerStarted","Data":"fb5f5c3bda09faeafbea87526cb4f73e0fe6787730b9964d3940f0a4ee87791a"} Nov 25 09:48:49 crc kubenswrapper[4565]: I1125 09:48:49.893493 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67bbcd57bf-nszbm" event={"ID":"f6405915-6a04-4cdb-b837-6f12e31bb7bc","Type":"ContainerStarted","Data":"43d6a1dbbfb0875c9c0b7390c845a99021fa9f8fc878d90d7f24745fbfb9ad5c"} Nov 25 09:48:49 crc kubenswrapper[4565]: I1125 09:48:49.893642 4565 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/horizon-67bbcd57bf-nszbm" podUID="f6405915-6a04-4cdb-b837-6f12e31bb7bc" containerName="horizon-log" containerID="cri-o://34c3ebad7ac7a403722ac7a691996ca6a7f45a867443ce361ce34bf562050ea1" gracePeriod=30 Nov 25 09:48:49 crc kubenswrapper[4565]: I1125 09:48:49.894015 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67bbcd57bf-nszbm" podUID="f6405915-6a04-4cdb-b837-6f12e31bb7bc" containerName="horizon" containerID="cri-o://43d6a1dbbfb0875c9c0b7390c845a99021fa9f8fc878d90d7f24745fbfb9ad5c" gracePeriod=30 Nov 25 09:48:49 crc kubenswrapper[4565]: I1125 09:48:49.897527 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6848d4c5cd-8fv74" event={"ID":"79e96477-2f1b-49e7-89f9-d6a18694af63","Type":"ContainerStarted","Data":"6dab57c95f45fe3dc539cfa46a834007bafce1552d436157f5484e112416ef6a"} Nov 25 09:48:49 crc kubenswrapper[4565]: I1125 09:48:49.899796 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e1ee609b-48dd-4e92-9b13-e9bf94768ead","Type":"ContainerStarted","Data":"598020dc8e269ce478e798e5985d342516977e59823dac5b91acc050af73eb7e"} Nov 25 09:48:49 crc kubenswrapper[4565]: I1125 09:48:49.899848 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e1ee609b-48dd-4e92-9b13-e9bf94768ead","Type":"ContainerStarted","Data":"e5eb6dcb7269e1b03ad45cf5576558209f3c680bab9dd1251c159f9fedf33bc6"} Nov 25 09:48:49 crc kubenswrapper[4565]: I1125 09:48:49.903286 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bbcb89d85-28rqw" event={"ID":"bd8b0555-5ea6-4fae-b234-7f7661406dd1","Type":"ContainerStarted","Data":"3591c5f5307d9cc5f2e73bed19564fa71b36948cdbcbf7fa45f1145c95fe891c"} Nov 25 09:48:49 crc kubenswrapper[4565]: I1125 09:48:49.903425 4565 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/horizon-5bbcb89d85-28rqw" podUID="bd8b0555-5ea6-4fae-b234-7f7661406dd1" containerName="horizon-log" containerID="cri-o://e58fe857475f1c2ec8bdb344f6d08e289a16dbdf058d1306864fa7ed5d4c3a0b" gracePeriod=30 Nov 25 09:48:49 crc kubenswrapper[4565]: I1125 09:48:49.903514 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5bbcb89d85-28rqw" podUID="bd8b0555-5ea6-4fae-b234-7f7661406dd1" containerName="horizon" containerID="cri-o://3591c5f5307d9cc5f2e73bed19564fa71b36948cdbcbf7fa45f1145c95fe891c" gracePeriod=30 Nov 25 09:48:49 crc kubenswrapper[4565]: I1125 09:48:49.911769 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-r5rpm" podStartSLOduration=3.065335686 podStartE2EDuration="13.911757246s" podCreationTimestamp="2025-11-25 09:48:36 +0000 UTC" firstStartedPulling="2025-11-25 09:48:37.51412465 +0000 UTC m=+2650.716619788" lastFinishedPulling="2025-11-25 09:48:48.36054621 +0000 UTC m=+2661.563041348" observedRunningTime="2025-11-25 09:48:49.910398655 +0000 UTC m=+2663.112893793" watchObservedRunningTime="2025-11-25 09:48:49.911757246 +0000 UTC m=+2663.114252384" Nov 25 09:48:49 crc kubenswrapper[4565]: I1125 09:48:49.913356 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6549bb6ccb-qd7ll" event={"ID":"d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb","Type":"ContainerStarted","Data":"1fa1a1c0bcd3dcd44a588a6696129d11daa59a2e586984b0f96d8a58d72cacaf"} Nov 25 09:48:49 crc kubenswrapper[4565]: I1125 09:48:49.937727 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5bbcb89d85-28rqw" podStartSLOduration=2.8124534 podStartE2EDuration="18.937716415s" podCreationTimestamp="2025-11-25 09:48:31 +0000 UTC" firstStartedPulling="2025-11-25 09:48:32.233455851 +0000 UTC m=+2645.435950979" lastFinishedPulling="2025-11-25 09:48:48.358718856 +0000 UTC m=+2661.561213994" observedRunningTime="2025-11-25 
09:48:49.930260238 +0000 UTC m=+2663.132755376" watchObservedRunningTime="2025-11-25 09:48:49.937716415 +0000 UTC m=+2663.140211553" Nov 25 09:48:49 crc kubenswrapper[4565]: I1125 09:48:49.954872 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6848d4c5cd-8fv74" podStartSLOduration=3.978986847 podStartE2EDuration="16.954848273s" podCreationTimestamp="2025-11-25 09:48:33 +0000 UTC" firstStartedPulling="2025-11-25 09:48:35.381771252 +0000 UTC m=+2648.584266390" lastFinishedPulling="2025-11-25 09:48:48.357632678 +0000 UTC m=+2661.560127816" observedRunningTime="2025-11-25 09:48:49.951455888 +0000 UTC m=+2663.153951015" watchObservedRunningTime="2025-11-25 09:48:49.954848273 +0000 UTC m=+2663.157343401" Nov 25 09:48:49 crc kubenswrapper[4565]: I1125 09:48:49.982163 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=11.982144893 podStartE2EDuration="11.982144893s" podCreationTimestamp="2025-11-25 09:48:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:48:49.973947199 +0000 UTC m=+2663.176442327" watchObservedRunningTime="2025-11-25 09:48:49.982144893 +0000 UTC m=+2663.184640031" Nov 25 09:48:50 crc kubenswrapper[4565]: I1125 09:48:50.053132 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6549bb6ccb-qd7ll" podStartSLOduration=4.068849899 podStartE2EDuration="17.053118555s" podCreationTimestamp="2025-11-25 09:48:33 +0000 UTC" firstStartedPulling="2025-11-25 09:48:35.371992036 +0000 UTC m=+2648.574487174" lastFinishedPulling="2025-11-25 09:48:48.356260692 +0000 UTC m=+2661.558755830" observedRunningTime="2025-11-25 09:48:50.048175797 +0000 UTC m=+2663.250670936" watchObservedRunningTime="2025-11-25 09:48:50.053118555 +0000 UTC m=+2663.255613694" Nov 25 09:48:50 crc kubenswrapper[4565]: I1125 
09:48:50.072061 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-67bbcd57bf-nszbm" podStartSLOduration=3.049594939 podStartE2EDuration="19.072047612s" podCreationTimestamp="2025-11-25 09:48:31 +0000 UTC" firstStartedPulling="2025-11-25 09:48:32.336259189 +0000 UTC m=+2645.538754328" lastFinishedPulling="2025-11-25 09:48:48.358711863 +0000 UTC m=+2661.561207001" observedRunningTime="2025-11-25 09:48:50.020129113 +0000 UTC m=+2663.222624251" watchObservedRunningTime="2025-11-25 09:48:50.072047612 +0000 UTC m=+2663.274542750" Nov 25 09:48:51 crc kubenswrapper[4565]: I1125 09:48:51.466831 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5bbcb89d85-28rqw" Nov 25 09:48:51 crc kubenswrapper[4565]: I1125 09:48:51.771488 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67bbcd57bf-nszbm" Nov 25 09:48:54 crc kubenswrapper[4565]: I1125 09:48:54.362087 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:48:54 crc kubenswrapper[4565]: I1125 09:48:54.362443 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:48:54 crc kubenswrapper[4565]: I1125 09:48:54.449229 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:54 crc kubenswrapper[4565]: I1125 09:48:54.449285 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:48:55 crc kubenswrapper[4565]: I1125 09:48:55.099150 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:48:55 
crc kubenswrapper[4565]: I1125 09:48:55.099202 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:48:55 crc kubenswrapper[4565]: I1125 09:48:55.984944 4565 generic.go:334] "Generic (PLEG): container finished" podID="b64f1fae-2be7-4ebd-a561-58884c10b4e6" containerID="fb5f5c3bda09faeafbea87526cb4f73e0fe6787730b9964d3940f0a4ee87791a" exitCode=0 Nov 25 09:48:55 crc kubenswrapper[4565]: I1125 09:48:55.984974 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-r5rpm" event={"ID":"b64f1fae-2be7-4ebd-a561-58884c10b4e6","Type":"ContainerDied","Data":"fb5f5c3bda09faeafbea87526cb4f73e0fe6787730b9964d3940f0a4ee87791a"} Nov 25 09:48:57 crc kubenswrapper[4565]: I1125 09:48:57.660126 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-r5rpm" Nov 25 09:48:57 crc kubenswrapper[4565]: I1125 09:48:57.755095 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k68d\" (UniqueName: \"kubernetes.io/projected/b64f1fae-2be7-4ebd-a561-58884c10b4e6-kube-api-access-9k68d\") pod \"b64f1fae-2be7-4ebd-a561-58884c10b4e6\" (UID: \"b64f1fae-2be7-4ebd-a561-58884c10b4e6\") " Nov 25 09:48:57 crc kubenswrapper[4565]: I1125 09:48:57.755158 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b64f1fae-2be7-4ebd-a561-58884c10b4e6-combined-ca-bundle\") pod \"b64f1fae-2be7-4ebd-a561-58884c10b4e6\" (UID: \"b64f1fae-2be7-4ebd-a561-58884c10b4e6\") " Nov 25 09:48:57 crc kubenswrapper[4565]: I1125 09:48:57.755220 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b64f1fae-2be7-4ebd-a561-58884c10b4e6-job-config-data\") pod \"b64f1fae-2be7-4ebd-a561-58884c10b4e6\" (UID: \"b64f1fae-2be7-4ebd-a561-58884c10b4e6\") " Nov 25 09:48:57 crc kubenswrapper[4565]: I1125 09:48:57.755357 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b64f1fae-2be7-4ebd-a561-58884c10b4e6-config-data\") pod \"b64f1fae-2be7-4ebd-a561-58884c10b4e6\" (UID: \"b64f1fae-2be7-4ebd-a561-58884c10b4e6\") " Nov 25 09:48:57 crc kubenswrapper[4565]: I1125 09:48:57.785097 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b64f1fae-2be7-4ebd-a561-58884c10b4e6-kube-api-access-9k68d" (OuterVolumeSpecName: "kube-api-access-9k68d") pod "b64f1fae-2be7-4ebd-a561-58884c10b4e6" (UID: "b64f1fae-2be7-4ebd-a561-58884c10b4e6"). InnerVolumeSpecName "kube-api-access-9k68d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:48:57 crc kubenswrapper[4565]: I1125 09:48:57.794795 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b64f1fae-2be7-4ebd-a561-58884c10b4e6-config-data" (OuterVolumeSpecName: "config-data") pod "b64f1fae-2be7-4ebd-a561-58884c10b4e6" (UID: "b64f1fae-2be7-4ebd-a561-58884c10b4e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:48:57 crc kubenswrapper[4565]: I1125 09:48:57.827392 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b64f1fae-2be7-4ebd-a561-58884c10b4e6-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "b64f1fae-2be7-4ebd-a561-58884c10b4e6" (UID: "b64f1fae-2be7-4ebd-a561-58884c10b4e6"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:48:57 crc kubenswrapper[4565]: I1125 09:48:57.844738 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b64f1fae-2be7-4ebd-a561-58884c10b4e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b64f1fae-2be7-4ebd-a561-58884c10b4e6" (UID: "b64f1fae-2be7-4ebd-a561-58884c10b4e6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:48:57 crc kubenswrapper[4565]: I1125 09:48:57.859284 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b64f1fae-2be7-4ebd-a561-58884c10b4e6-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:57 crc kubenswrapper[4565]: I1125 09:48:57.859321 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k68d\" (UniqueName: \"kubernetes.io/projected/b64f1fae-2be7-4ebd-a561-58884c10b4e6-kube-api-access-9k68d\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:57 crc kubenswrapper[4565]: I1125 09:48:57.859339 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b64f1fae-2be7-4ebd-a561-58884c10b4e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:57 crc kubenswrapper[4565]: I1125 09:48:57.859353 4565 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b64f1fae-2be7-4ebd-a561-58884c10b4e6-job-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.012188 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-r5rpm" event={"ID":"b64f1fae-2be7-4ebd-a561-58884c10b4e6","Type":"ContainerDied","Data":"87408143fc2381057d58ab1321cb8cf94927e2b9d16fb7de83de4e7fd3dd25c1"} Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.012354 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87408143fc2381057d58ab1321cb8cf94927e2b9d16fb7de83de4e7fd3dd25c1" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.012496 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-r5rpm" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.571521 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Nov 25 09:48:58 crc kubenswrapper[4565]: E1125 09:48:58.572485 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b64f1fae-2be7-4ebd-a561-58884c10b4e6" containerName="manila-db-sync" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.572510 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b64f1fae-2be7-4ebd-a561-58884c10b4e6" containerName="manila-db-sync" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.572889 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="b64f1fae-2be7-4ebd-a561-58884c10b4e6" containerName="manila-db-sync" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.580488 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.590424 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-4s7rx" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.607085 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.609502 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.609947 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.679535 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.720147 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Nov 25 09:48:58 crc 
kubenswrapper[4565]: I1125 09:48:58.721904 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.728444 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.732967 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.744770 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eca2257b-f543-4f54-a433-dc3d852b6f59-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.744907 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eca2257b-f543-4f54-a433-dc3d852b6f59-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.744959 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eca2257b-f543-4f54-a433-dc3d852b6f59-config-data\") pod \"manila-share-share1-0\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.745032 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/eca2257b-f543-4f54-a433-dc3d852b6f59-var-lib-manila\") pod \"manila-share-share1-0\" (UID: 
\"eca2257b-f543-4f54-a433-dc3d852b6f59\") " pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.745064 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eca2257b-f543-4f54-a433-dc3d852b6f59-scripts\") pod \"manila-share-share1-0\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.745126 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/eca2257b-f543-4f54-a433-dc3d852b6f59-ceph\") pod \"manila-share-share1-0\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.745149 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxl7j\" (UniqueName: \"kubernetes.io/projected/eca2257b-f543-4f54-a433-dc3d852b6f59-kube-api-access-mxl7j\") pod \"manila-share-share1-0\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.745176 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eca2257b-f543-4f54-a433-dc3d852b6f59-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.746634 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f7c9fd6c5-k87xr"] Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.748154 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.760977 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f7c9fd6c5-k87xr"] Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.780148 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.782061 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.785918 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.803086 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.846677 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15b78aa8-e609-4509-bc5e-4ec6fa67dd57-dns-svc\") pod \"dnsmasq-dns-f7c9fd6c5-k87xr\" (UID: \"15b78aa8-e609-4509-bc5e-4ec6fa67dd57\") " pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.846733 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd7bff4-af86-442a-b108-dfa46e230085-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"acd7bff4-af86-442a-b108-dfa46e230085\") " pod="openstack/manila-scheduler-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.846761 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be276b7b-2c96-4658-abd6-415f85f99b8c-scripts\") pod \"manila-api-0\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " 
pod="openstack/manila-api-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.846780 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be276b7b-2c96-4658-abd6-415f85f99b8c-logs\") pod \"manila-api-0\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " pod="openstack/manila-api-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.846811 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eca2257b-f543-4f54-a433-dc3d852b6f59-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.846827 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eca2257b-f543-4f54-a433-dc3d852b6f59-config-data\") pod \"manila-share-share1-0\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.846866 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be276b7b-2c96-4658-abd6-415f85f99b8c-config-data\") pod \"manila-api-0\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " pod="openstack/manila-api-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.846892 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acd7bff4-af86-442a-b108-dfa46e230085-scripts\") pod \"manila-scheduler-0\" (UID: \"acd7bff4-af86-442a-b108-dfa46e230085\") " pod="openstack/manila-scheduler-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.846905 4565 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/eca2257b-f543-4f54-a433-dc3d852b6f59-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.846938 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eca2257b-f543-4f54-a433-dc3d852b6f59-scripts\") pod \"manila-share-share1-0\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.846985 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/eca2257b-f543-4f54-a433-dc3d852b6f59-ceph\") pod \"manila-share-share1-0\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.847000 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be276b7b-2c96-4658-abd6-415f85f99b8c-etc-machine-id\") pod \"manila-api-0\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " pod="openstack/manila-api-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.847016 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxl7j\" (UniqueName: \"kubernetes.io/projected/eca2257b-f543-4f54-a433-dc3d852b6f59-kube-api-access-mxl7j\") pod \"manila-share-share1-0\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.847038 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eca2257b-f543-4f54-a433-dc3d852b6f59-combined-ca-bundle\") pod 
\"manila-share-share1-0\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.847071 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd7bff4-af86-442a-b108-dfa46e230085-config-data\") pod \"manila-scheduler-0\" (UID: \"acd7bff4-af86-442a-b108-dfa46e230085\") " pod="openstack/manila-scheduler-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.847104 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15b78aa8-e609-4509-bc5e-4ec6fa67dd57-config\") pod \"dnsmasq-dns-f7c9fd6c5-k87xr\" (UID: \"15b78aa8-e609-4509-bc5e-4ec6fa67dd57\") " pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.847128 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/acd7bff4-af86-442a-b108-dfa46e230085-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"acd7bff4-af86-442a-b108-dfa46e230085\") " pod="openstack/manila-scheduler-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.847157 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/acd7bff4-af86-442a-b108-dfa46e230085-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"acd7bff4-af86-442a-b108-dfa46e230085\") " pod="openstack/manila-scheduler-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.847195 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be276b7b-2c96-4658-abd6-415f85f99b8c-combined-ca-bundle\") pod \"manila-api-0\" (UID: 
\"be276b7b-2c96-4658-abd6-415f85f99b8c\") " pod="openstack/manila-api-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.847214 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzp2d\" (UniqueName: \"kubernetes.io/projected/acd7bff4-af86-442a-b108-dfa46e230085-kube-api-access-wzp2d\") pod \"manila-scheduler-0\" (UID: \"acd7bff4-af86-442a-b108-dfa46e230085\") " pod="openstack/manila-scheduler-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.847257 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/15b78aa8-e609-4509-bc5e-4ec6fa67dd57-openstack-edpm-ipam\") pod \"dnsmasq-dns-f7c9fd6c5-k87xr\" (UID: \"15b78aa8-e609-4509-bc5e-4ec6fa67dd57\") " pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.847274 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be276b7b-2c96-4658-abd6-415f85f99b8c-config-data-custom\") pod \"manila-api-0\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " pod="openstack/manila-api-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.847288 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eca2257b-f543-4f54-a433-dc3d852b6f59-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.847302 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15b78aa8-e609-4509-bc5e-4ec6fa67dd57-ovsdbserver-nb\") pod \"dnsmasq-dns-f7c9fd6c5-k87xr\" (UID: 
\"15b78aa8-e609-4509-bc5e-4ec6fa67dd57\") " pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.847343 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwmwm\" (UniqueName: \"kubernetes.io/projected/15b78aa8-e609-4509-bc5e-4ec6fa67dd57-kube-api-access-pwmwm\") pod \"dnsmasq-dns-f7c9fd6c5-k87xr\" (UID: \"15b78aa8-e609-4509-bc5e-4ec6fa67dd57\") " pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.847364 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4nw4\" (UniqueName: \"kubernetes.io/projected/be276b7b-2c96-4658-abd6-415f85f99b8c-kube-api-access-v4nw4\") pod \"manila-api-0\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " pod="openstack/manila-api-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.847382 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15b78aa8-e609-4509-bc5e-4ec6fa67dd57-ovsdbserver-sb\") pod \"dnsmasq-dns-f7c9fd6c5-k87xr\" (UID: \"15b78aa8-e609-4509-bc5e-4ec6fa67dd57\") " pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.847475 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eca2257b-f543-4f54-a433-dc3d852b6f59-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.850153 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/eca2257b-f543-4f54-a433-dc3d852b6f59-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " 
pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.860380 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eca2257b-f543-4f54-a433-dc3d852b6f59-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.868302 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eca2257b-f543-4f54-a433-dc3d852b6f59-scripts\") pod \"manila-share-share1-0\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.870742 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eca2257b-f543-4f54-a433-dc3d852b6f59-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.873818 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/eca2257b-f543-4f54-a433-dc3d852b6f59-ceph\") pod \"manila-share-share1-0\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.875856 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxl7j\" (UniqueName: \"kubernetes.io/projected/eca2257b-f543-4f54-a433-dc3d852b6f59-kube-api-access-mxl7j\") pod \"manila-share-share1-0\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.889639 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/eca2257b-f543-4f54-a433-dc3d852b6f59-config-data\") pod \"manila-share-share1-0\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.913065 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.951257 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be276b7b-2c96-4658-abd6-415f85f99b8c-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " pod="openstack/manila-api-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.951298 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzp2d\" (UniqueName: \"kubernetes.io/projected/acd7bff4-af86-442a-b108-dfa46e230085-kube-api-access-wzp2d\") pod \"manila-scheduler-0\" (UID: \"acd7bff4-af86-442a-b108-dfa46e230085\") " pod="openstack/manila-scheduler-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.951356 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/15b78aa8-e609-4509-bc5e-4ec6fa67dd57-openstack-edpm-ipam\") pod \"dnsmasq-dns-f7c9fd6c5-k87xr\" (UID: \"15b78aa8-e609-4509-bc5e-4ec6fa67dd57\") " pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.951378 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be276b7b-2c96-4658-abd6-415f85f99b8c-config-data-custom\") pod \"manila-api-0\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " pod="openstack/manila-api-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.951396 4565 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15b78aa8-e609-4509-bc5e-4ec6fa67dd57-ovsdbserver-nb\") pod \"dnsmasq-dns-f7c9fd6c5-k87xr\" (UID: \"15b78aa8-e609-4509-bc5e-4ec6fa67dd57\") " pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.951444 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwmwm\" (UniqueName: \"kubernetes.io/projected/15b78aa8-e609-4509-bc5e-4ec6fa67dd57-kube-api-access-pwmwm\") pod \"dnsmasq-dns-f7c9fd6c5-k87xr\" (UID: \"15b78aa8-e609-4509-bc5e-4ec6fa67dd57\") " pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.951470 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4nw4\" (UniqueName: \"kubernetes.io/projected/be276b7b-2c96-4658-abd6-415f85f99b8c-kube-api-access-v4nw4\") pod \"manila-api-0\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " pod="openstack/manila-api-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.951487 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15b78aa8-e609-4509-bc5e-4ec6fa67dd57-ovsdbserver-sb\") pod \"dnsmasq-dns-f7c9fd6c5-k87xr\" (UID: \"15b78aa8-e609-4509-bc5e-4ec6fa67dd57\") " pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.951516 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15b78aa8-e609-4509-bc5e-4ec6fa67dd57-dns-svc\") pod \"dnsmasq-dns-f7c9fd6c5-k87xr\" (UID: \"15b78aa8-e609-4509-bc5e-4ec6fa67dd57\") " pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.951535 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/acd7bff4-af86-442a-b108-dfa46e230085-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"acd7bff4-af86-442a-b108-dfa46e230085\") " pod="openstack/manila-scheduler-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.951554 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be276b7b-2c96-4658-abd6-415f85f99b8c-logs\") pod \"manila-api-0\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " pod="openstack/manila-api-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.951571 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be276b7b-2c96-4658-abd6-415f85f99b8c-scripts\") pod \"manila-api-0\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " pod="openstack/manila-api-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.951631 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be276b7b-2c96-4658-abd6-415f85f99b8c-config-data\") pod \"manila-api-0\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " pod="openstack/manila-api-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.951661 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acd7bff4-af86-442a-b108-dfa46e230085-scripts\") pod \"manila-scheduler-0\" (UID: \"acd7bff4-af86-442a-b108-dfa46e230085\") " pod="openstack/manila-scheduler-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.951728 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be276b7b-2c96-4658-abd6-415f85f99b8c-etc-machine-id\") pod \"manila-api-0\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " pod="openstack/manila-api-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 
09:48:58.951782 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd7bff4-af86-442a-b108-dfa46e230085-config-data\") pod \"manila-scheduler-0\" (UID: \"acd7bff4-af86-442a-b108-dfa46e230085\") " pod="openstack/manila-scheduler-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.951821 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15b78aa8-e609-4509-bc5e-4ec6fa67dd57-config\") pod \"dnsmasq-dns-f7c9fd6c5-k87xr\" (UID: \"15b78aa8-e609-4509-bc5e-4ec6fa67dd57\") " pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.951849 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/acd7bff4-af86-442a-b108-dfa46e230085-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"acd7bff4-af86-442a-b108-dfa46e230085\") " pod="openstack/manila-scheduler-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.951887 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/acd7bff4-af86-442a-b108-dfa46e230085-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"acd7bff4-af86-442a-b108-dfa46e230085\") " pod="openstack/manila-scheduler-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.955101 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15b78aa8-e609-4509-bc5e-4ec6fa67dd57-dns-svc\") pod \"dnsmasq-dns-f7c9fd6c5-k87xr\" (UID: \"15b78aa8-e609-4509-bc5e-4ec6fa67dd57\") " pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.956741 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/15b78aa8-e609-4509-bc5e-4ec6fa67dd57-ovsdbserver-nb\") pod \"dnsmasq-dns-f7c9fd6c5-k87xr\" (UID: \"15b78aa8-e609-4509-bc5e-4ec6fa67dd57\") " pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.958133 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15b78aa8-e609-4509-bc5e-4ec6fa67dd57-config\") pod \"dnsmasq-dns-f7c9fd6c5-k87xr\" (UID: \"15b78aa8-e609-4509-bc5e-4ec6fa67dd57\") " pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.958299 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/acd7bff4-af86-442a-b108-dfa46e230085-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"acd7bff4-af86-442a-b108-dfa46e230085\") " pod="openstack/manila-scheduler-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.958990 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/15b78aa8-e609-4509-bc5e-4ec6fa67dd57-openstack-edpm-ipam\") pod \"dnsmasq-dns-f7c9fd6c5-k87xr\" (UID: \"15b78aa8-e609-4509-bc5e-4ec6fa67dd57\") " pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.959291 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be276b7b-2c96-4658-abd6-415f85f99b8c-logs\") pod \"manila-api-0\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " pod="openstack/manila-api-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.959643 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be276b7b-2c96-4658-abd6-415f85f99b8c-etc-machine-id\") pod \"manila-api-0\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " 
pod="openstack/manila-api-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.969458 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15b78aa8-e609-4509-bc5e-4ec6fa67dd57-ovsdbserver-sb\") pod \"dnsmasq-dns-f7c9fd6c5-k87xr\" (UID: \"15b78aa8-e609-4509-bc5e-4ec6fa67dd57\") " pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.969883 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be276b7b-2c96-4658-abd6-415f85f99b8c-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " pod="openstack/manila-api-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.970695 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd7bff4-af86-442a-b108-dfa46e230085-config-data\") pod \"manila-scheduler-0\" (UID: \"acd7bff4-af86-442a-b108-dfa46e230085\") " pod="openstack/manila-scheduler-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.972529 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be276b7b-2c96-4658-abd6-415f85f99b8c-scripts\") pod \"manila-api-0\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " pod="openstack/manila-api-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.978423 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd7bff4-af86-442a-b108-dfa46e230085-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"acd7bff4-af86-442a-b108-dfa46e230085\") " pod="openstack/manila-scheduler-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.978676 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/acd7bff4-af86-442a-b108-dfa46e230085-scripts\") pod \"manila-scheduler-0\" (UID: \"acd7bff4-af86-442a-b108-dfa46e230085\") " pod="openstack/manila-scheduler-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.981323 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzp2d\" (UniqueName: \"kubernetes.io/projected/acd7bff4-af86-442a-b108-dfa46e230085-kube-api-access-wzp2d\") pod \"manila-scheduler-0\" (UID: \"acd7bff4-af86-442a-b108-dfa46e230085\") " pod="openstack/manila-scheduler-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.985595 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be276b7b-2c96-4658-abd6-415f85f99b8c-config-data\") pod \"manila-api-0\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " pod="openstack/manila-api-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.987283 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/acd7bff4-af86-442a-b108-dfa46e230085-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"acd7bff4-af86-442a-b108-dfa46e230085\") " pod="openstack/manila-scheduler-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.988349 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4nw4\" (UniqueName: \"kubernetes.io/projected/be276b7b-2c96-4658-abd6-415f85f99b8c-kube-api-access-v4nw4\") pod \"manila-api-0\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " pod="openstack/manila-api-0" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.995505 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwmwm\" (UniqueName: \"kubernetes.io/projected/15b78aa8-e609-4509-bc5e-4ec6fa67dd57-kube-api-access-pwmwm\") pod \"dnsmasq-dns-f7c9fd6c5-k87xr\" (UID: \"15b78aa8-e609-4509-bc5e-4ec6fa67dd57\") " 
pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" Nov 25 09:48:58 crc kubenswrapper[4565]: I1125 09:48:58.995773 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be276b7b-2c96-4658-abd6-415f85f99b8c-config-data-custom\") pod \"manila-api-0\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " pod="openstack/manila-api-0" Nov 25 09:48:59 crc kubenswrapper[4565]: I1125 09:48:59.054534 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Nov 25 09:48:59 crc kubenswrapper[4565]: I1125 09:48:59.072122 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" Nov 25 09:48:59 crc kubenswrapper[4565]: I1125 09:48:59.126421 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Nov 25 09:48:59 crc kubenswrapper[4565]: I1125 09:48:59.391239 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 25 09:48:59 crc kubenswrapper[4565]: I1125 09:48:59.391504 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 25 09:48:59 crc kubenswrapper[4565]: I1125 09:48:59.508300 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 25 09:48:59 crc kubenswrapper[4565]: I1125 09:48:59.526862 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 25 09:48:59 crc kubenswrapper[4565]: I1125 09:48:59.843804 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Nov 25 09:48:59 crc kubenswrapper[4565]: W1125 09:48:59.900072 4565 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeca2257b_f543_4f54_a433_dc3d852b6f59.slice/crio-93f7a3c45673ec6737a2de24cd328e7a06f1e277f30d265b6c0bb6ea1afd9853 WatchSource:0}: Error finding container 93f7a3c45673ec6737a2de24cd328e7a06f1e277f30d265b6c0bb6ea1afd9853: Status 404 returned error can't find the container with id 93f7a3c45673ec6737a2de24cd328e7a06f1e277f30d265b6c0bb6ea1afd9853 Nov 25 09:48:59 crc kubenswrapper[4565]: I1125 09:48:59.940233 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Nov 25 09:48:59 crc kubenswrapper[4565]: I1125 09:48:59.957625 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f7c9fd6c5-k87xr"] Nov 25 09:49:00 crc kubenswrapper[4565]: I1125 09:49:00.089588 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"acd7bff4-af86-442a-b108-dfa46e230085","Type":"ContainerStarted","Data":"291e4716c6c5def04cced4a6c6a666d796190709d7dfe0d3c291243a80c9d453"} Nov 25 09:49:00 crc kubenswrapper[4565]: I1125 09:49:00.106778 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Nov 25 09:49:00 crc kubenswrapper[4565]: I1125 09:49:00.125230 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" event={"ID":"15b78aa8-e609-4509-bc5e-4ec6fa67dd57","Type":"ContainerStarted","Data":"42237b832f5c3f9947d9c976bbd33ef33a69f458c996617ea744b2fae5fbd6c2"} Nov 25 09:49:00 crc kubenswrapper[4565]: I1125 09:49:00.133043 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"eca2257b-f543-4f54-a433-dc3d852b6f59","Type":"ContainerStarted","Data":"93f7a3c45673ec6737a2de24cd328e7a06f1e277f30d265b6c0bb6ea1afd9853"} Nov 25 09:49:00 crc kubenswrapper[4565]: I1125 09:49:00.133761 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 25 09:49:00 
crc kubenswrapper[4565]: I1125 09:49:00.133788 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 25 09:49:01 crc kubenswrapper[4565]: I1125 09:49:01.162260 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"be276b7b-2c96-4658-abd6-415f85f99b8c","Type":"ContainerStarted","Data":"358cb026ebaf03c544608c50ed2b74e7a55f19fc71082300f4908fc80bb5c141"} Nov 25 09:49:01 crc kubenswrapper[4565]: I1125 09:49:01.162736 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"be276b7b-2c96-4658-abd6-415f85f99b8c","Type":"ContainerStarted","Data":"78cf2312e29a461041f98402fd58fc68b80b80ab642e8cecd980b299c98aa1eb"} Nov 25 09:49:01 crc kubenswrapper[4565]: I1125 09:49:01.179085 4565 generic.go:334] "Generic (PLEG): container finished" podID="15b78aa8-e609-4509-bc5e-4ec6fa67dd57" containerID="27e97fb9d4869199b6f2a363c72a7edb81a2f6ebfefe1f3e06217b740954b6c2" exitCode=0 Nov 25 09:49:01 crc kubenswrapper[4565]: I1125 09:49:01.180330 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" event={"ID":"15b78aa8-e609-4509-bc5e-4ec6fa67dd57","Type":"ContainerDied","Data":"27e97fb9d4869199b6f2a363c72a7edb81a2f6ebfefe1f3e06217b740954b6c2"} Nov 25 09:49:02 crc kubenswrapper[4565]: I1125 09:49:02.210036 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"acd7bff4-af86-442a-b108-dfa46e230085","Type":"ContainerStarted","Data":"ecb242d8ba237a50659257a5780081dabb328054106928762b1dabdf8f71a3bc"} Nov 25 09:49:02 crc kubenswrapper[4565]: I1125 09:49:02.220992 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" event={"ID":"15b78aa8-e609-4509-bc5e-4ec6fa67dd57","Type":"ContainerStarted","Data":"a8c1f67559f39a819109008d8d314798966d94ac69323e34dd782104f6ab14f6"} Nov 25 09:49:02 crc kubenswrapper[4565]: I1125 
09:49:02.222482 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" Nov 25 09:49:02 crc kubenswrapper[4565]: I1125 09:49:02.231633 4565 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 09:49:02 crc kubenswrapper[4565]: I1125 09:49:02.231660 4565 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 09:49:02 crc kubenswrapper[4565]: I1125 09:49:02.233242 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"be276b7b-2c96-4658-abd6-415f85f99b8c","Type":"ContainerStarted","Data":"ca1b7100406b5d2cff9cd789e2a155798d78f33fd442d43d1dcbdf3658d5e950"} Nov 25 09:49:02 crc kubenswrapper[4565]: I1125 09:49:02.233282 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Nov 25 09:49:02 crc kubenswrapper[4565]: I1125 09:49:02.298473 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" podStartSLOduration=4.298438235 podStartE2EDuration="4.298438235s" podCreationTimestamp="2025-11-25 09:48:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:49:02.294916797 +0000 UTC m=+2675.497411935" watchObservedRunningTime="2025-11-25 09:49:02.298438235 +0000 UTC m=+2675.500933373" Nov 25 09:49:02 crc kubenswrapper[4565]: I1125 09:49:02.355642 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.355621088 podStartE2EDuration="4.355621088s" podCreationTimestamp="2025-11-25 09:48:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:49:02.321369536 +0000 UTC m=+2675.523864674" watchObservedRunningTime="2025-11-25 09:49:02.355621088 +0000 UTC m=+2675.558116226" Nov 
25 09:49:03 crc kubenswrapper[4565]: I1125 09:49:03.245255 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"acd7bff4-af86-442a-b108-dfa46e230085","Type":"ContainerStarted","Data":"8100481176bcc7be5d7fe2920ecf756d06369e5db67af034f09890b7c638b5d3"} Nov 25 09:49:03 crc kubenswrapper[4565]: I1125 09:49:03.273464 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=4.152907495 podStartE2EDuration="5.2734466s" podCreationTimestamp="2025-11-25 09:48:58 +0000 UTC" firstStartedPulling="2025-11-25 09:48:59.96830838 +0000 UTC m=+2673.170803519" lastFinishedPulling="2025-11-25 09:49:01.088847486 +0000 UTC m=+2674.291342624" observedRunningTime="2025-11-25 09:49:03.268312722 +0000 UTC m=+2676.470807860" watchObservedRunningTime="2025-11-25 09:49:03.2734466 +0000 UTC m=+2676.475941728" Nov 25 09:49:04 crc kubenswrapper[4565]: I1125 09:49:04.037478 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Nov 25 09:49:04 crc kubenswrapper[4565]: I1125 09:49:04.255079 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="be276b7b-2c96-4658-abd6-415f85f99b8c" containerName="manila-api-log" containerID="cri-o://358cb026ebaf03c544608c50ed2b74e7a55f19fc71082300f4908fc80bb5c141" gracePeriod=30 Nov 25 09:49:04 crc kubenswrapper[4565]: I1125 09:49:04.255182 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="be276b7b-2c96-4658-abd6-415f85f99b8c" containerName="manila-api" containerID="cri-o://ca1b7100406b5d2cff9cd789e2a155798d78f33fd442d43d1dcbdf3658d5e950" gracePeriod=30 Nov 25 09:49:04 crc kubenswrapper[4565]: I1125 09:49:04.363407 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6848d4c5cd-8fv74" podUID="79e96477-2f1b-49e7-89f9-d6a18694af63" containerName="horizon" probeResult="failure" 
output="Get \"https://10.217.0.242:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.242:8443: connect: connection refused" Nov 25 09:49:04 crc kubenswrapper[4565]: I1125 09:49:04.450745 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6549bb6ccb-qd7ll" podUID="d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.243:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.243:8443: connect: connection refused" Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.308941 4565 generic.go:334] "Generic (PLEG): container finished" podID="be276b7b-2c96-4658-abd6-415f85f99b8c" containerID="ca1b7100406b5d2cff9cd789e2a155798d78f33fd442d43d1dcbdf3658d5e950" exitCode=0 Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.309198 4565 generic.go:334] "Generic (PLEG): container finished" podID="be276b7b-2c96-4658-abd6-415f85f99b8c" containerID="358cb026ebaf03c544608c50ed2b74e7a55f19fc71082300f4908fc80bb5c141" exitCode=143 Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.309223 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"be276b7b-2c96-4658-abd6-415f85f99b8c","Type":"ContainerDied","Data":"ca1b7100406b5d2cff9cd789e2a155798d78f33fd442d43d1dcbdf3658d5e950"} Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.309255 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"be276b7b-2c96-4658-abd6-415f85f99b8c","Type":"ContainerDied","Data":"358cb026ebaf03c544608c50ed2b74e7a55f19fc71082300f4908fc80bb5c141"} Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.467242 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.578535 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be276b7b-2c96-4658-abd6-415f85f99b8c-scripts\") pod \"be276b7b-2c96-4658-abd6-415f85f99b8c\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.578697 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4nw4\" (UniqueName: \"kubernetes.io/projected/be276b7b-2c96-4658-abd6-415f85f99b8c-kube-api-access-v4nw4\") pod \"be276b7b-2c96-4658-abd6-415f85f99b8c\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.578730 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be276b7b-2c96-4658-abd6-415f85f99b8c-logs\") pod \"be276b7b-2c96-4658-abd6-415f85f99b8c\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.578779 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be276b7b-2c96-4658-abd6-415f85f99b8c-combined-ca-bundle\") pod \"be276b7b-2c96-4658-abd6-415f85f99b8c\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.578847 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be276b7b-2c96-4658-abd6-415f85f99b8c-etc-machine-id\") pod \"be276b7b-2c96-4658-abd6-415f85f99b8c\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.578895 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/be276b7b-2c96-4658-abd6-415f85f99b8c-config-data\") pod \"be276b7b-2c96-4658-abd6-415f85f99b8c\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.579550 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be276b7b-2c96-4658-abd6-415f85f99b8c-config-data-custom\") pod \"be276b7b-2c96-4658-abd6-415f85f99b8c\" (UID: \"be276b7b-2c96-4658-abd6-415f85f99b8c\") " Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.578979 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be276b7b-2c96-4658-abd6-415f85f99b8c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "be276b7b-2c96-4658-abd6-415f85f99b8c" (UID: "be276b7b-2c96-4658-abd6-415f85f99b8c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.579179 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be276b7b-2c96-4658-abd6-415f85f99b8c-logs" (OuterVolumeSpecName: "logs") pod "be276b7b-2c96-4658-abd6-415f85f99b8c" (UID: "be276b7b-2c96-4658-abd6-415f85f99b8c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.609075 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be276b7b-2c96-4658-abd6-415f85f99b8c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "be276b7b-2c96-4658-abd6-415f85f99b8c" (UID: "be276b7b-2c96-4658-abd6-415f85f99b8c"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.625057 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be276b7b-2c96-4658-abd6-415f85f99b8c-scripts" (OuterVolumeSpecName: "scripts") pod "be276b7b-2c96-4658-abd6-415f85f99b8c" (UID: "be276b7b-2c96-4658-abd6-415f85f99b8c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.625272 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be276b7b-2c96-4658-abd6-415f85f99b8c-kube-api-access-v4nw4" (OuterVolumeSpecName: "kube-api-access-v4nw4") pod "be276b7b-2c96-4658-abd6-415f85f99b8c" (UID: "be276b7b-2c96-4658-abd6-415f85f99b8c"). InnerVolumeSpecName "kube-api-access-v4nw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.682877 4565 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be276b7b-2c96-4658-abd6-415f85f99b8c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.682910 4565 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be276b7b-2c96-4658-abd6-415f85f99b8c-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.682920 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be276b7b-2c96-4658-abd6-415f85f99b8c-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.682942 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4nw4\" (UniqueName: \"kubernetes.io/projected/be276b7b-2c96-4658-abd6-415f85f99b8c-kube-api-access-v4nw4\") on node \"crc\" DevicePath 
\"\"" Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.682956 4565 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be276b7b-2c96-4658-abd6-415f85f99b8c-logs\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.694401 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be276b7b-2c96-4658-abd6-415f85f99b8c-config-data" (OuterVolumeSpecName: "config-data") pod "be276b7b-2c96-4658-abd6-415f85f99b8c" (UID: "be276b7b-2c96-4658-abd6-415f85f99b8c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.711728 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be276b7b-2c96-4658-abd6-415f85f99b8c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be276b7b-2c96-4658-abd6-415f85f99b8c" (UID: "be276b7b-2c96-4658-abd6-415f85f99b8c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.785992 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be276b7b-2c96-4658-abd6-415f85f99b8c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.786019 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be276b7b-2c96-4658-abd6-415f85f99b8c-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.949376 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.949506 4565 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 09:49:05 crc kubenswrapper[4565]: I1125 09:49:05.952118 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.318529 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"be276b7b-2c96-4658-abd6-415f85f99b8c","Type":"ContainerDied","Data":"78cf2312e29a461041f98402fd58fc68b80b80ab642e8cecd980b299c98aa1eb"} Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.318541 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.318867 4565 scope.go:117] "RemoveContainer" containerID="ca1b7100406b5d2cff9cd789e2a155798d78f33fd442d43d1dcbdf3658d5e950" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.363633 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.363784 4565 scope.go:117] "RemoveContainer" containerID="358cb026ebaf03c544608c50ed2b74e7a55f19fc71082300f4908fc80bb5c141" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.379275 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.390851 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Nov 25 09:49:06 crc kubenswrapper[4565]: E1125 09:49:06.391203 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be276b7b-2c96-4658-abd6-415f85f99b8c" containerName="manila-api" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.391224 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="be276b7b-2c96-4658-abd6-415f85f99b8c" containerName="manila-api" Nov 25 09:49:06 crc kubenswrapper[4565]: E1125 09:49:06.391237 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be276b7b-2c96-4658-abd6-415f85f99b8c" containerName="manila-api-log" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.391244 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="be276b7b-2c96-4658-abd6-415f85f99b8c" containerName="manila-api-log" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.391434 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="be276b7b-2c96-4658-abd6-415f85f99b8c" containerName="manila-api" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.391462 4565 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="be276b7b-2c96-4658-abd6-415f85f99b8c" containerName="manila-api-log" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.392340 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.398112 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.398206 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.402952 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.409785 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.508953 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j785l\" (UniqueName: \"kubernetes.io/projected/dbdf37e4-2636-44d8-981a-4c960e37799e-kube-api-access-j785l\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.508993 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dbdf37e4-2636-44d8-981a-4c960e37799e-config-data-custom\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.509022 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dbdf37e4-2636-44d8-981a-4c960e37799e-etc-machine-id\") pod \"manila-api-0\" (UID: 
\"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.509046 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbdf37e4-2636-44d8-981a-4c960e37799e-config-data\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.509074 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbdf37e4-2636-44d8-981a-4c960e37799e-scripts\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.509125 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbdf37e4-2636-44d8-981a-4c960e37799e-logs\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.509194 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbdf37e4-2636-44d8-981a-4c960e37799e-public-tls-certs\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.509251 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbdf37e4-2636-44d8-981a-4c960e37799e-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.509273 4565 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbdf37e4-2636-44d8-981a-4c960e37799e-internal-tls-certs\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.620499 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbdf37e4-2636-44d8-981a-4c960e37799e-public-tls-certs\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.620609 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbdf37e4-2636-44d8-981a-4c960e37799e-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.620646 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbdf37e4-2636-44d8-981a-4c960e37799e-internal-tls-certs\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.620696 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j785l\" (UniqueName: \"kubernetes.io/projected/dbdf37e4-2636-44d8-981a-4c960e37799e-kube-api-access-j785l\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.620717 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/dbdf37e4-2636-44d8-981a-4c960e37799e-config-data-custom\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.620756 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dbdf37e4-2636-44d8-981a-4c960e37799e-etc-machine-id\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.620783 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbdf37e4-2636-44d8-981a-4c960e37799e-config-data\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.620813 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbdf37e4-2636-44d8-981a-4c960e37799e-scripts\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.620909 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbdf37e4-2636-44d8-981a-4c960e37799e-logs\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.622826 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dbdf37e4-2636-44d8-981a-4c960e37799e-etc-machine-id\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.630895 4565 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbdf37e4-2636-44d8-981a-4c960e37799e-config-data\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.632813 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbdf37e4-2636-44d8-981a-4c960e37799e-logs\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.634423 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbdf37e4-2636-44d8-981a-4c960e37799e-scripts\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.635068 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dbdf37e4-2636-44d8-981a-4c960e37799e-config-data-custom\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.647011 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbdf37e4-2636-44d8-981a-4c960e37799e-internal-tls-certs\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.648304 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbdf37e4-2636-44d8-981a-4c960e37799e-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 
25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.650909 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbdf37e4-2636-44d8-981a-4c960e37799e-public-tls-certs\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.677664 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j785l\" (UniqueName: \"kubernetes.io/projected/dbdf37e4-2636-44d8-981a-4c960e37799e-kube-api-access-j785l\") pod \"manila-api-0\" (UID: \"dbdf37e4-2636-44d8-981a-4c960e37799e\") " pod="openstack/manila-api-0" Nov 25 09:49:06 crc kubenswrapper[4565]: I1125 09:49:06.722765 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Nov 25 09:49:07 crc kubenswrapper[4565]: I1125 09:49:07.119643 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be276b7b-2c96-4658-abd6-415f85f99b8c" path="/var/lib/kubelet/pods/be276b7b-2c96-4658-abd6-415f85f99b8c/volumes" Nov 25 09:49:07 crc kubenswrapper[4565]: I1125 09:49:07.390502 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Nov 25 09:49:07 crc kubenswrapper[4565]: W1125 09:49:07.405195 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbdf37e4_2636_44d8_981a_4c960e37799e.slice/crio-ae1c7a930baa31fdf10c91f44e0cee2c8c1abaeda142184538b79616c5bedd8b WatchSource:0}: Error finding container ae1c7a930baa31fdf10c91f44e0cee2c8c1abaeda142184538b79616c5bedd8b: Status 404 returned error can't find the container with id ae1c7a930baa31fdf10c91f44e0cee2c8c1abaeda142184538b79616c5bedd8b Nov 25 09:49:08 crc kubenswrapper[4565]: I1125 09:49:08.357443 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" 
event={"ID":"dbdf37e4-2636-44d8-981a-4c960e37799e","Type":"ContainerStarted","Data":"88f1cc45c0b9f3da7d88a0fee6fa3d4f4b1f98999ef45953f8cd3964a08975da"} Nov 25 09:49:08 crc kubenswrapper[4565]: I1125 09:49:08.357975 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"dbdf37e4-2636-44d8-981a-4c960e37799e","Type":"ContainerStarted","Data":"ae1c7a930baa31fdf10c91f44e0cee2c8c1abaeda142184538b79616c5bedd8b"} Nov 25 09:49:09 crc kubenswrapper[4565]: I1125 09:49:09.055059 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Nov 25 09:49:09 crc kubenswrapper[4565]: I1125 09:49:09.075048 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f7c9fd6c5-k87xr" Nov 25 09:49:09 crc kubenswrapper[4565]: I1125 09:49:09.163436 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-858f54d499-ngjgw"] Nov 25 09:49:09 crc kubenswrapper[4565]: I1125 09:49:09.163891 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-858f54d499-ngjgw" podUID="a957b81e-1acc-4e1c-be9b-0c5be361ebb3" containerName="dnsmasq-dns" containerID="cri-o://7237006e54da0b2b5355ea89f0f6624c100e6576d7d63fbac41bdccc12c3ac22" gracePeriod=10 Nov 25 09:49:09 crc kubenswrapper[4565]: I1125 09:49:09.387324 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"dbdf37e4-2636-44d8-981a-4c960e37799e","Type":"ContainerStarted","Data":"97659e3195c82484ce6933b8585a16191cee3a2e2ce803fec76bc9e53af33165"} Nov 25 09:49:09 crc kubenswrapper[4565]: I1125 09:49:09.388708 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Nov 25 09:49:09 crc kubenswrapper[4565]: I1125 09:49:09.394366 4565 generic.go:334] "Generic (PLEG): container finished" podID="a957b81e-1acc-4e1c-be9b-0c5be361ebb3" 
containerID="7237006e54da0b2b5355ea89f0f6624c100e6576d7d63fbac41bdccc12c3ac22" exitCode=0 Nov 25 09:49:09 crc kubenswrapper[4565]: I1125 09:49:09.394404 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-858f54d499-ngjgw" event={"ID":"a957b81e-1acc-4e1c-be9b-0c5be361ebb3","Type":"ContainerDied","Data":"7237006e54da0b2b5355ea89f0f6624c100e6576d7d63fbac41bdccc12c3ac22"} Nov 25 09:49:09 crc kubenswrapper[4565]: I1125 09:49:09.421272 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.421260857 podStartE2EDuration="3.421260857s" podCreationTimestamp="2025-11-25 09:49:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:49:09.412076583 +0000 UTC m=+2682.614571720" watchObservedRunningTime="2025-11-25 09:49:09.421260857 +0000 UTC m=+2682.623755994" Nov 25 09:49:09 crc kubenswrapper[4565]: I1125 09:49:09.958765 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-858f54d499-ngjgw" Nov 25 09:49:10 crc kubenswrapper[4565]: I1125 09:49:10.054494 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-ovsdbserver-sb\") pod \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\" (UID: \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\") " Nov 25 09:49:10 crc kubenswrapper[4565]: I1125 09:49:10.055410 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfvcc\" (UniqueName: \"kubernetes.io/projected/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-kube-api-access-cfvcc\") pod \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\" (UID: \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\") " Nov 25 09:49:10 crc kubenswrapper[4565]: I1125 09:49:10.055646 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-config\") pod \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\" (UID: \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\") " Nov 25 09:49:10 crc kubenswrapper[4565]: I1125 09:49:10.056536 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-openstack-edpm-ipam\") pod \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\" (UID: \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\") " Nov 25 09:49:10 crc kubenswrapper[4565]: I1125 09:49:10.056727 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-ovsdbserver-nb\") pod \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\" (UID: \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\") " Nov 25 09:49:10 crc kubenswrapper[4565]: I1125 09:49:10.056859 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-dns-svc\") pod \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\" (UID: \"a957b81e-1acc-4e1c-be9b-0c5be361ebb3\") " Nov 25 09:49:10 crc kubenswrapper[4565]: I1125 09:49:10.079058 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-kube-api-access-cfvcc" (OuterVolumeSpecName: "kube-api-access-cfvcc") pod "a957b81e-1acc-4e1c-be9b-0c5be361ebb3" (UID: "a957b81e-1acc-4e1c-be9b-0c5be361ebb3"). InnerVolumeSpecName "kube-api-access-cfvcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:49:10 crc kubenswrapper[4565]: I1125 09:49:10.142488 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a957b81e-1acc-4e1c-be9b-0c5be361ebb3" (UID: "a957b81e-1acc-4e1c-be9b-0c5be361ebb3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:49:10 crc kubenswrapper[4565]: I1125 09:49:10.154190 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-config" (OuterVolumeSpecName: "config") pod "a957b81e-1acc-4e1c-be9b-0c5be361ebb3" (UID: "a957b81e-1acc-4e1c-be9b-0c5be361ebb3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:49:10 crc kubenswrapper[4565]: I1125 09:49:10.161611 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfvcc\" (UniqueName: \"kubernetes.io/projected/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-kube-api-access-cfvcc\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:10 crc kubenswrapper[4565]: I1125 09:49:10.161641 4565 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-config\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:10 crc kubenswrapper[4565]: I1125 09:49:10.161654 4565 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:10 crc kubenswrapper[4565]: I1125 09:49:10.196335 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "a957b81e-1acc-4e1c-be9b-0c5be361ebb3" (UID: "a957b81e-1acc-4e1c-be9b-0c5be361ebb3"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:49:10 crc kubenswrapper[4565]: I1125 09:49:10.204301 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a957b81e-1acc-4e1c-be9b-0c5be361ebb3" (UID: "a957b81e-1acc-4e1c-be9b-0c5be361ebb3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:49:10 crc kubenswrapper[4565]: I1125 09:49:10.207632 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a957b81e-1acc-4e1c-be9b-0c5be361ebb3" (UID: "a957b81e-1acc-4e1c-be9b-0c5be361ebb3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:49:10 crc kubenswrapper[4565]: I1125 09:49:10.263116 4565 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:10 crc kubenswrapper[4565]: I1125 09:49:10.263143 4565 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:10 crc kubenswrapper[4565]: I1125 09:49:10.263154 4565 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a957b81e-1acc-4e1c-be9b-0c5be361ebb3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:10 crc kubenswrapper[4565]: I1125 09:49:10.422580 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-858f54d499-ngjgw" Nov 25 09:49:10 crc kubenswrapper[4565]: I1125 09:49:10.423005 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-858f54d499-ngjgw" event={"ID":"a957b81e-1acc-4e1c-be9b-0c5be361ebb3","Type":"ContainerDied","Data":"32c3237b462740ab9c03801a11150b3697e1f2d9542189744fad4aabf1ced20a"} Nov 25 09:49:10 crc kubenswrapper[4565]: I1125 09:49:10.423107 4565 scope.go:117] "RemoveContainer" containerID="7237006e54da0b2b5355ea89f0f6624c100e6576d7d63fbac41bdccc12c3ac22" Nov 25 09:49:10 crc kubenswrapper[4565]: I1125 09:49:10.471100 4565 scope.go:117] "RemoveContainer" containerID="cd631c5374a4dfbeb7aceaeb7997ae23a826c021fbecee1fb56f2511fd59603c" Nov 25 09:49:10 crc kubenswrapper[4565]: I1125 09:49:10.479040 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-858f54d499-ngjgw"] Nov 25 09:49:10 crc kubenswrapper[4565]: I1125 09:49:10.490498 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-858f54d499-ngjgw"] Nov 25 09:49:11 crc kubenswrapper[4565]: I1125 09:49:11.113545 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a957b81e-1acc-4e1c-be9b-0c5be361ebb3" path="/var/lib/kubelet/pods/a957b81e-1acc-4e1c-be9b-0c5be361ebb3/volumes" Nov 25 09:49:14 crc kubenswrapper[4565]: I1125 09:49:14.249264 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:49:14 crc kubenswrapper[4565]: I1125 09:49:14.249499 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e8c96668-a926-4c72-8d14-4a7fa95b89cd" containerName="ceilometer-central-agent" containerID="cri-o://dca4219f4122ad019aeb2b0488da9b72fd339aae9dfa8951172c224aa8d24909" gracePeriod=30 Nov 25 09:49:14 crc kubenswrapper[4565]: I1125 09:49:14.249609 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="e8c96668-a926-4c72-8d14-4a7fa95b89cd" containerName="proxy-httpd" containerID="cri-o://ae2e8815c54b843e87677bd998363eefad45104c69084dfe3b9fce2408251cd4" gracePeriod=30 Nov 25 09:49:14 crc kubenswrapper[4565]: I1125 09:49:14.249644 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e8c96668-a926-4c72-8d14-4a7fa95b89cd" containerName="sg-core" containerID="cri-o://b5b5ad7ce46bf4df020b0a3a8b9731b0be26aae3115d73b31cd301496b8bc65f" gracePeriod=30 Nov 25 09:49:14 crc kubenswrapper[4565]: I1125 09:49:14.249677 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e8c96668-a926-4c72-8d14-4a7fa95b89cd" containerName="ceilometer-notification-agent" containerID="cri-o://6d211cf1c58730f51191a09c61542a9d6686ddcb2b9b063bcf46b4e2fd511248" gracePeriod=30 Nov 25 09:49:14 crc kubenswrapper[4565]: I1125 09:49:14.361944 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6848d4c5cd-8fv74" podUID="79e96477-2f1b-49e7-89f9-d6a18694af63" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.242:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.242:8443: connect: connection refused" Nov 25 09:49:14 crc kubenswrapper[4565]: I1125 09:49:14.449538 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6549bb6ccb-qd7ll" podUID="d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.243:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.243:8443: connect: connection refused" Nov 25 09:49:14 crc kubenswrapper[4565]: I1125 09:49:14.487192 4565 generic.go:334] "Generic (PLEG): container finished" podID="e8c96668-a926-4c72-8d14-4a7fa95b89cd" containerID="ae2e8815c54b843e87677bd998363eefad45104c69084dfe3b9fce2408251cd4" exitCode=0 Nov 25 09:49:14 crc kubenswrapper[4565]: I1125 09:49:14.487230 4565 generic.go:334] 
"Generic (PLEG): container finished" podID="e8c96668-a926-4c72-8d14-4a7fa95b89cd" containerID="b5b5ad7ce46bf4df020b0a3a8b9731b0be26aae3115d73b31cd301496b8bc65f" exitCode=2 Nov 25 09:49:14 crc kubenswrapper[4565]: I1125 09:49:14.487255 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8c96668-a926-4c72-8d14-4a7fa95b89cd","Type":"ContainerDied","Data":"ae2e8815c54b843e87677bd998363eefad45104c69084dfe3b9fce2408251cd4"} Nov 25 09:49:14 crc kubenswrapper[4565]: I1125 09:49:14.487283 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8c96668-a926-4c72-8d14-4a7fa95b89cd","Type":"ContainerDied","Data":"b5b5ad7ce46bf4df020b0a3a8b9731b0be26aae3115d73b31cd301496b8bc65f"} Nov 25 09:49:15 crc kubenswrapper[4565]: I1125 09:49:15.506420 4565 generic.go:334] "Generic (PLEG): container finished" podID="e8c96668-a926-4c72-8d14-4a7fa95b89cd" containerID="6d211cf1c58730f51191a09c61542a9d6686ddcb2b9b063bcf46b4e2fd511248" exitCode=0 Nov 25 09:49:15 crc kubenswrapper[4565]: I1125 09:49:15.507656 4565 generic.go:334] "Generic (PLEG): container finished" podID="e8c96668-a926-4c72-8d14-4a7fa95b89cd" containerID="dca4219f4122ad019aeb2b0488da9b72fd339aae9dfa8951172c224aa8d24909" exitCode=0 Nov 25 09:49:15 crc kubenswrapper[4565]: I1125 09:49:15.506669 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8c96668-a926-4c72-8d14-4a7fa95b89cd","Type":"ContainerDied","Data":"6d211cf1c58730f51191a09c61542a9d6686ddcb2b9b063bcf46b4e2fd511248"} Nov 25 09:49:15 crc kubenswrapper[4565]: I1125 09:49:15.507754 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8c96668-a926-4c72-8d14-4a7fa95b89cd","Type":"ContainerDied","Data":"dca4219f4122ad019aeb2b0488da9b72fd339aae9dfa8951172c224aa8d24909"} Nov 25 09:49:16 crc kubenswrapper[4565]: I1125 09:49:16.791394 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:49:16 crc kubenswrapper[4565]: I1125 09:49:16.967027 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-combined-ca-bundle\") pod \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " Nov 25 09:49:16 crc kubenswrapper[4565]: I1125 09:49:16.967191 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c96668-a926-4c72-8d14-4a7fa95b89cd-run-httpd\") pod \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " Nov 25 09:49:16 crc kubenswrapper[4565]: I1125 09:49:16.967433 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-config-data\") pod \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " Nov 25 09:49:16 crc kubenswrapper[4565]: I1125 09:49:16.967552 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-ceilometer-tls-certs\") pod \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " Nov 25 09:49:16 crc kubenswrapper[4565]: I1125 09:49:16.967712 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-scripts\") pod \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " Nov 25 09:49:16 crc kubenswrapper[4565]: I1125 09:49:16.967895 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e8c96668-a926-4c72-8d14-4a7fa95b89cd-log-httpd\") pod \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " Nov 25 09:49:16 crc kubenswrapper[4565]: I1125 09:49:16.968001 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-sg-core-conf-yaml\") pod \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " Nov 25 09:49:16 crc kubenswrapper[4565]: I1125 09:49:16.968098 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-766tm\" (UniqueName: \"kubernetes.io/projected/e8c96668-a926-4c72-8d14-4a7fa95b89cd-kube-api-access-766tm\") pod \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\" (UID: \"e8c96668-a926-4c72-8d14-4a7fa95b89cd\") " Nov 25 09:49:16 crc kubenswrapper[4565]: I1125 09:49:16.971046 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8c96668-a926-4c72-8d14-4a7fa95b89cd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e8c96668-a926-4c72-8d14-4a7fa95b89cd" (UID: "e8c96668-a926-4c72-8d14-4a7fa95b89cd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:49:16 crc kubenswrapper[4565]: I1125 09:49:16.971178 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8c96668-a926-4c72-8d14-4a7fa95b89cd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e8c96668-a926-4c72-8d14-4a7fa95b89cd" (UID: "e8c96668-a926-4c72-8d14-4a7fa95b89cd"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:49:16 crc kubenswrapper[4565]: I1125 09:49:16.999042 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8c96668-a926-4c72-8d14-4a7fa95b89cd-kube-api-access-766tm" (OuterVolumeSpecName: "kube-api-access-766tm") pod "e8c96668-a926-4c72-8d14-4a7fa95b89cd" (UID: "e8c96668-a926-4c72-8d14-4a7fa95b89cd"). InnerVolumeSpecName "kube-api-access-766tm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.019493 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-scripts" (OuterVolumeSpecName: "scripts") pod "e8c96668-a926-4c72-8d14-4a7fa95b89cd" (UID: "e8c96668-a926-4c72-8d14-4a7fa95b89cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.071096 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.071127 4565 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c96668-a926-4c72-8d14-4a7fa95b89cd-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.071141 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-766tm\" (UniqueName: \"kubernetes.io/projected/e8c96668-a926-4c72-8d14-4a7fa95b89cd-kube-api-access-766tm\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.071153 4565 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c96668-a926-4c72-8d14-4a7fa95b89cd-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:17 
crc kubenswrapper[4565]: I1125 09:49:17.076163 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e8c96668-a926-4c72-8d14-4a7fa95b89cd" (UID: "e8c96668-a926-4c72-8d14-4a7fa95b89cd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.154743 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e8c96668-a926-4c72-8d14-4a7fa95b89cd" (UID: "e8c96668-a926-4c72-8d14-4a7fa95b89cd"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.171334 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8c96668-a926-4c72-8d14-4a7fa95b89cd" (UID: "e8c96668-a926-4c72-8d14-4a7fa95b89cd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.172299 4565 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.172319 4565 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.172328 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.189432 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-config-data" (OuterVolumeSpecName: "config-data") pod "e8c96668-a926-4c72-8d14-4a7fa95b89cd" (UID: "e8c96668-a926-4c72-8d14-4a7fa95b89cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.273269 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8c96668-a926-4c72-8d14-4a7fa95b89cd-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.546536 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8c96668-a926-4c72-8d14-4a7fa95b89cd","Type":"ContainerDied","Data":"4ca76214db06ab128ac41fb62f1c98736f2be39f46f9b15c2548e257f8317e8d"} Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.546956 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.547060 4565 scope.go:117] "RemoveContainer" containerID="ae2e8815c54b843e87677bd998363eefad45104c69084dfe3b9fce2408251cd4" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.560515 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"eca2257b-f543-4f54-a433-dc3d852b6f59","Type":"ContainerStarted","Data":"bc84a9f864de84083c43ec971810c5726d5e76174c8603a8355e2fa7acde3f49"} Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.583050 4565 scope.go:117] "RemoveContainer" containerID="b5b5ad7ce46bf4df020b0a3a8b9731b0be26aae3115d73b31cd301496b8bc65f" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.592980 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.602557 4565 scope.go:117] "RemoveContainer" containerID="6d211cf1c58730f51191a09c61542a9d6686ddcb2b9b063bcf46b4e2fd511248" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.604313 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.625991 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:49:17 crc kubenswrapper[4565]: E1125 09:49:17.626502 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8c96668-a926-4c72-8d14-4a7fa95b89cd" containerName="ceilometer-notification-agent" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.626566 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c96668-a926-4c72-8d14-4a7fa95b89cd" containerName="ceilometer-notification-agent" Nov 25 09:49:17 crc kubenswrapper[4565]: E1125 09:49:17.626628 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8c96668-a926-4c72-8d14-4a7fa95b89cd" containerName="sg-core" Nov 25 09:49:17 crc 
kubenswrapper[4565]: I1125 09:49:17.626675 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c96668-a926-4c72-8d14-4a7fa95b89cd" containerName="sg-core" Nov 25 09:49:17 crc kubenswrapper[4565]: E1125 09:49:17.626733 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8c96668-a926-4c72-8d14-4a7fa95b89cd" containerName="proxy-httpd" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.626776 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c96668-a926-4c72-8d14-4a7fa95b89cd" containerName="proxy-httpd" Nov 25 09:49:17 crc kubenswrapper[4565]: E1125 09:49:17.626834 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8c96668-a926-4c72-8d14-4a7fa95b89cd" containerName="ceilometer-central-agent" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.626878 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c96668-a926-4c72-8d14-4a7fa95b89cd" containerName="ceilometer-central-agent" Nov 25 09:49:17 crc kubenswrapper[4565]: E1125 09:49:17.626965 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a957b81e-1acc-4e1c-be9b-0c5be361ebb3" containerName="init" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.627027 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a957b81e-1acc-4e1c-be9b-0c5be361ebb3" containerName="init" Nov 25 09:49:17 crc kubenswrapper[4565]: E1125 09:49:17.627105 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a957b81e-1acc-4e1c-be9b-0c5be361ebb3" containerName="dnsmasq-dns" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.627153 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a957b81e-1acc-4e1c-be9b-0c5be361ebb3" containerName="dnsmasq-dns" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.627356 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8c96668-a926-4c72-8d14-4a7fa95b89cd" containerName="proxy-httpd" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.627417 4565 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e8c96668-a926-4c72-8d14-4a7fa95b89cd" containerName="sg-core" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.627475 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8c96668-a926-4c72-8d14-4a7fa95b89cd" containerName="ceilometer-central-agent" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.627522 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="a957b81e-1acc-4e1c-be9b-0c5be361ebb3" containerName="dnsmasq-dns" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.627570 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8c96668-a926-4c72-8d14-4a7fa95b89cd" containerName="ceilometer-notification-agent" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.629462 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.632715 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.632945 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.660967 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.663521 4565 scope.go:117] "RemoveContainer" containerID="dca4219f4122ad019aeb2b0488da9b72fd339aae9dfa8951172c224aa8d24909" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.669073 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.783912 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.784032 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-config-data\") pod \"ceilometer-0\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.784102 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.784178 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf889ed2-4f79-4b35-b7d9-860583df2614-run-httpd\") pod \"ceilometer-0\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.784266 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf889ed2-4f79-4b35-b7d9-860583df2614-log-httpd\") pod \"ceilometer-0\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.784419 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"bf889ed2-4f79-4b35-b7d9-860583df2614\") " pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.784500 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-scripts\") pod \"ceilometer-0\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.784572 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb8nr\" (UniqueName: \"kubernetes.io/projected/bf889ed2-4f79-4b35-b7d9-860583df2614-kube-api-access-vb8nr\") pod \"ceilometer-0\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.887337 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb8nr\" (UniqueName: \"kubernetes.io/projected/bf889ed2-4f79-4b35-b7d9-860583df2614-kube-api-access-vb8nr\") pod \"ceilometer-0\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.887387 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.887460 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-config-data\") pod \"ceilometer-0\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.887515 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.887575 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf889ed2-4f79-4b35-b7d9-860583df2614-run-httpd\") pod \"ceilometer-0\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.887647 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf889ed2-4f79-4b35-b7d9-860583df2614-log-httpd\") pod \"ceilometer-0\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.887792 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.887870 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-scripts\") pod \"ceilometer-0\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.891496 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf889ed2-4f79-4b35-b7d9-860583df2614-log-httpd\") pod \"ceilometer-0\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " 
pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.892196 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf889ed2-4f79-4b35-b7d9-860583df2614-run-httpd\") pod \"ceilometer-0\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.896322 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.896380 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.897522 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-config-data\") pod \"ceilometer-0\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.899829 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.933542 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb8nr\" (UniqueName: 
\"kubernetes.io/projected/bf889ed2-4f79-4b35-b7d9-860583df2614-kube-api-access-vb8nr\") pod \"ceilometer-0\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " pod="openstack/ceilometer-0" Nov 25 09:49:17 crc kubenswrapper[4565]: I1125 09:49:17.933546 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-scripts\") pod \"ceilometer-0\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " pod="openstack/ceilometer-0" Nov 25 09:49:18 crc kubenswrapper[4565]: I1125 09:49:18.008365 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:49:18 crc kubenswrapper[4565]: I1125 09:49:18.572468 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"eca2257b-f543-4f54-a433-dc3d852b6f59","Type":"ContainerStarted","Data":"f30cc3c7597fac6264afd18deacc583a2754c3f4e3f24458cbdb41f88c4b6eed"} Nov 25 09:49:18 crc kubenswrapper[4565]: I1125 09:49:18.608073 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.102835937 podStartE2EDuration="20.608055833s" podCreationTimestamp="2025-11-25 09:48:58 +0000 UTC" firstStartedPulling="2025-11-25 09:48:59.929165446 +0000 UTC m=+2673.131660574" lastFinishedPulling="2025-11-25 09:49:16.434385332 +0000 UTC m=+2689.636880470" observedRunningTime="2025-11-25 09:49:18.592457716 +0000 UTC m=+2691.794952854" watchObservedRunningTime="2025-11-25 09:49:18.608055833 +0000 UTC m=+2691.810550991" Nov 25 09:49:18 crc kubenswrapper[4565]: I1125 09:49:18.642513 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:49:18 crc kubenswrapper[4565]: I1125 09:49:18.914645 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Nov 25 09:49:19 crc kubenswrapper[4565]: I1125 
09:49:19.107104 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8c96668-a926-4c72-8d14-4a7fa95b89cd" path="/var/lib/kubelet/pods/e8c96668-a926-4c72-8d14-4a7fa95b89cd/volumes" Nov 25 09:49:19 crc kubenswrapper[4565]: I1125 09:49:19.583272 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf889ed2-4f79-4b35-b7d9-860583df2614","Type":"ContainerStarted","Data":"3943420a92278ad409a1aeafd333ec0d3fad6752f6c8fdbf8784afac2143befc"} Nov 25 09:49:19 crc kubenswrapper[4565]: I1125 09:49:19.583311 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf889ed2-4f79-4b35-b7d9-860583df2614","Type":"ContainerStarted","Data":"92e60f2ef82205984c18d3d660b187d1fd8032e7714d8f000f059f4eb17a1894"} Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.354337 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67bbcd57bf-nszbm" Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.450130 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6405915-6a04-4cdb-b837-6f12e31bb7bc-logs\") pod \"f6405915-6a04-4cdb-b837-6f12e31bb7bc\" (UID: \"f6405915-6a04-4cdb-b837-6f12e31bb7bc\") " Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.450313 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4hds\" (UniqueName: \"kubernetes.io/projected/f6405915-6a04-4cdb-b837-6f12e31bb7bc-kube-api-access-n4hds\") pod \"f6405915-6a04-4cdb-b837-6f12e31bb7bc\" (UID: \"f6405915-6a04-4cdb-b837-6f12e31bb7bc\") " Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.450494 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f6405915-6a04-4cdb-b837-6f12e31bb7bc-horizon-secret-key\") pod \"f6405915-6a04-4cdb-b837-6f12e31bb7bc\" 
(UID: \"f6405915-6a04-4cdb-b837-6f12e31bb7bc\") " Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.450553 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6405915-6a04-4cdb-b837-6f12e31bb7bc-config-data\") pod \"f6405915-6a04-4cdb-b837-6f12e31bb7bc\" (UID: \"f6405915-6a04-4cdb-b837-6f12e31bb7bc\") " Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.450644 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6405915-6a04-4cdb-b837-6f12e31bb7bc-scripts\") pod \"f6405915-6a04-4cdb-b837-6f12e31bb7bc\" (UID: \"f6405915-6a04-4cdb-b837-6f12e31bb7bc\") " Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.451035 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6405915-6a04-4cdb-b837-6f12e31bb7bc-logs" (OuterVolumeSpecName: "logs") pod "f6405915-6a04-4cdb-b837-6f12e31bb7bc" (UID: "f6405915-6a04-4cdb-b837-6f12e31bb7bc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.451671 4565 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6405915-6a04-4cdb-b837-6f12e31bb7bc-logs\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.455850 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6405915-6a04-4cdb-b837-6f12e31bb7bc-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f6405915-6a04-4cdb-b837-6f12e31bb7bc" (UID: "f6405915-6a04-4cdb-b837-6f12e31bb7bc"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.459701 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6405915-6a04-4cdb-b837-6f12e31bb7bc-kube-api-access-n4hds" (OuterVolumeSpecName: "kube-api-access-n4hds") pod "f6405915-6a04-4cdb-b837-6f12e31bb7bc" (UID: "f6405915-6a04-4cdb-b837-6f12e31bb7bc"). InnerVolumeSpecName "kube-api-access-n4hds". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.477698 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6405915-6a04-4cdb-b837-6f12e31bb7bc-scripts" (OuterVolumeSpecName: "scripts") pod "f6405915-6a04-4cdb-b837-6f12e31bb7bc" (UID: "f6405915-6a04-4cdb-b837-6f12e31bb7bc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.480225 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6405915-6a04-4cdb-b837-6f12e31bb7bc-config-data" (OuterVolumeSpecName: "config-data") pod "f6405915-6a04-4cdb-b837-6f12e31bb7bc" (UID: "f6405915-6a04-4cdb-b837-6f12e31bb7bc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.556753 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4hds\" (UniqueName: \"kubernetes.io/projected/f6405915-6a04-4cdb-b837-6f12e31bb7bc-kube-api-access-n4hds\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.556795 4565 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f6405915-6a04-4cdb-b837-6f12e31bb7bc-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.556806 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6405915-6a04-4cdb-b837-6f12e31bb7bc-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.556816 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6405915-6a04-4cdb-b837-6f12e31bb7bc-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.601900 4565 generic.go:334] "Generic (PLEG): container finished" podID="f6405915-6a04-4cdb-b837-6f12e31bb7bc" containerID="43d6a1dbbfb0875c9c0b7390c845a99021fa9f8fc878d90d7f24745fbfb9ad5c" exitCode=137 Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.601945 4565 generic.go:334] "Generic (PLEG): container finished" podID="f6405915-6a04-4cdb-b837-6f12e31bb7bc" containerID="34c3ebad7ac7a403722ac7a691996ca6a7f45a867443ce361ce34bf562050ea1" exitCode=137 Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.601991 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67bbcd57bf-nszbm" event={"ID":"f6405915-6a04-4cdb-b837-6f12e31bb7bc","Type":"ContainerDied","Data":"43d6a1dbbfb0875c9c0b7390c845a99021fa9f8fc878d90d7f24745fbfb9ad5c"} Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 
09:49:20.602019 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67bbcd57bf-nszbm" event={"ID":"f6405915-6a04-4cdb-b837-6f12e31bb7bc","Type":"ContainerDied","Data":"34c3ebad7ac7a403722ac7a691996ca6a7f45a867443ce361ce34bf562050ea1"} Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.602031 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67bbcd57bf-nszbm" event={"ID":"f6405915-6a04-4cdb-b837-6f12e31bb7bc","Type":"ContainerDied","Data":"c04d2084cd8ae19fa33f7a3e724cfd9a7dbfe411f05e5a1067da35fe4515fe2b"} Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.602046 4565 scope.go:117] "RemoveContainer" containerID="43d6a1dbbfb0875c9c0b7390c845a99021fa9f8fc878d90d7f24745fbfb9ad5c" Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.602184 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67bbcd57bf-nszbm" Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.615057 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf889ed2-4f79-4b35-b7d9-860583df2614","Type":"ContainerStarted","Data":"3c9c8d70e1328ddd506d9962728090d1d03f37031dc47d277d468f272e16d180"} Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.623212 4565 generic.go:334] "Generic (PLEG): container finished" podID="bd8b0555-5ea6-4fae-b234-7f7661406dd1" containerID="3591c5f5307d9cc5f2e73bed19564fa71b36948cdbcbf7fa45f1145c95fe891c" exitCode=137 Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.623238 4565 generic.go:334] "Generic (PLEG): container finished" podID="bd8b0555-5ea6-4fae-b234-7f7661406dd1" containerID="e58fe857475f1c2ec8bdb344f6d08e289a16dbdf058d1306864fa7ed5d4c3a0b" exitCode=137 Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.624463 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bbcb89d85-28rqw" 
event={"ID":"bd8b0555-5ea6-4fae-b234-7f7661406dd1","Type":"ContainerDied","Data":"3591c5f5307d9cc5f2e73bed19564fa71b36948cdbcbf7fa45f1145c95fe891c"} Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.624494 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bbcb89d85-28rqw" event={"ID":"bd8b0555-5ea6-4fae-b234-7f7661406dd1","Type":"ContainerDied","Data":"e58fe857475f1c2ec8bdb344f6d08e289a16dbdf058d1306864fa7ed5d4c3a0b"} Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.695002 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67bbcd57bf-nszbm"] Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.704889 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-67bbcd57bf-nszbm"] Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.824341 4565 scope.go:117] "RemoveContainer" containerID="34c3ebad7ac7a403722ac7a691996ca6a7f45a867443ce361ce34bf562050ea1" Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.846946 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5bbcb89d85-28rqw" Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.858154 4565 scope.go:117] "RemoveContainer" containerID="43d6a1dbbfb0875c9c0b7390c845a99021fa9f8fc878d90d7f24745fbfb9ad5c" Nov 25 09:49:20 crc kubenswrapper[4565]: E1125 09:49:20.882613 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43d6a1dbbfb0875c9c0b7390c845a99021fa9f8fc878d90d7f24745fbfb9ad5c\": container with ID starting with 43d6a1dbbfb0875c9c0b7390c845a99021fa9f8fc878d90d7f24745fbfb9ad5c not found: ID does not exist" containerID="43d6a1dbbfb0875c9c0b7390c845a99021fa9f8fc878d90d7f24745fbfb9ad5c" Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.882648 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d6a1dbbfb0875c9c0b7390c845a99021fa9f8fc878d90d7f24745fbfb9ad5c"} err="failed to get container status \"43d6a1dbbfb0875c9c0b7390c845a99021fa9f8fc878d90d7f24745fbfb9ad5c\": rpc error: code = NotFound desc = could not find container \"43d6a1dbbfb0875c9c0b7390c845a99021fa9f8fc878d90d7f24745fbfb9ad5c\": container with ID starting with 43d6a1dbbfb0875c9c0b7390c845a99021fa9f8fc878d90d7f24745fbfb9ad5c not found: ID does not exist" Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.882676 4565 scope.go:117] "RemoveContainer" containerID="34c3ebad7ac7a403722ac7a691996ca6a7f45a867443ce361ce34bf562050ea1" Nov 25 09:49:20 crc kubenswrapper[4565]: E1125 09:49:20.887413 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34c3ebad7ac7a403722ac7a691996ca6a7f45a867443ce361ce34bf562050ea1\": container with ID starting with 34c3ebad7ac7a403722ac7a691996ca6a7f45a867443ce361ce34bf562050ea1 not found: ID does not exist" containerID="34c3ebad7ac7a403722ac7a691996ca6a7f45a867443ce361ce34bf562050ea1" Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 
09:49:20.887440 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34c3ebad7ac7a403722ac7a691996ca6a7f45a867443ce361ce34bf562050ea1"} err="failed to get container status \"34c3ebad7ac7a403722ac7a691996ca6a7f45a867443ce361ce34bf562050ea1\": rpc error: code = NotFound desc = could not find container \"34c3ebad7ac7a403722ac7a691996ca6a7f45a867443ce361ce34bf562050ea1\": container with ID starting with 34c3ebad7ac7a403722ac7a691996ca6a7f45a867443ce361ce34bf562050ea1 not found: ID does not exist" Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.887456 4565 scope.go:117] "RemoveContainer" containerID="43d6a1dbbfb0875c9c0b7390c845a99021fa9f8fc878d90d7f24745fbfb9ad5c" Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.891491 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d6a1dbbfb0875c9c0b7390c845a99021fa9f8fc878d90d7f24745fbfb9ad5c"} err="failed to get container status \"43d6a1dbbfb0875c9c0b7390c845a99021fa9f8fc878d90d7f24745fbfb9ad5c\": rpc error: code = NotFound desc = could not find container \"43d6a1dbbfb0875c9c0b7390c845a99021fa9f8fc878d90d7f24745fbfb9ad5c\": container with ID starting with 43d6a1dbbfb0875c9c0b7390c845a99021fa9f8fc878d90d7f24745fbfb9ad5c not found: ID does not exist" Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.891515 4565 scope.go:117] "RemoveContainer" containerID="34c3ebad7ac7a403722ac7a691996ca6a7f45a867443ce361ce34bf562050ea1" Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.892575 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34c3ebad7ac7a403722ac7a691996ca6a7f45a867443ce361ce34bf562050ea1"} err="failed to get container status \"34c3ebad7ac7a403722ac7a691996ca6a7f45a867443ce361ce34bf562050ea1\": rpc error: code = NotFound desc = could not find container \"34c3ebad7ac7a403722ac7a691996ca6a7f45a867443ce361ce34bf562050ea1\": container with ID starting with 
34c3ebad7ac7a403722ac7a691996ca6a7f45a867443ce361ce34bf562050ea1 not found: ID does not exist" Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.986587 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bd8b0555-5ea6-4fae-b234-7f7661406dd1-horizon-secret-key\") pod \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\" (UID: \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\") " Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.986648 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd8b0555-5ea6-4fae-b234-7f7661406dd1-logs\") pod \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\" (UID: \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\") " Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.986692 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd8b0555-5ea6-4fae-b234-7f7661406dd1-scripts\") pod \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\" (UID: \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\") " Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.986726 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd8b0555-5ea6-4fae-b234-7f7661406dd1-config-data\") pod \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\" (UID: \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\") " Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.986777 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdhv8\" (UniqueName: \"kubernetes.io/projected/bd8b0555-5ea6-4fae-b234-7f7661406dd1-kube-api-access-sdhv8\") pod \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\" (UID: \"bd8b0555-5ea6-4fae-b234-7f7661406dd1\") " Nov 25 09:49:20 crc kubenswrapper[4565]: I1125 09:49:20.987386 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/bd8b0555-5ea6-4fae-b234-7f7661406dd1-logs" (OuterVolumeSpecName: "logs") pod "bd8b0555-5ea6-4fae-b234-7f7661406dd1" (UID: "bd8b0555-5ea6-4fae-b234-7f7661406dd1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:20.999233 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd8b0555-5ea6-4fae-b234-7f7661406dd1-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "bd8b0555-5ea6-4fae-b234-7f7661406dd1" (UID: "bd8b0555-5ea6-4fae-b234-7f7661406dd1"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.054548 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd8b0555-5ea6-4fae-b234-7f7661406dd1-kube-api-access-sdhv8" (OuterVolumeSpecName: "kube-api-access-sdhv8") pod "bd8b0555-5ea6-4fae-b234-7f7661406dd1" (UID: "bd8b0555-5ea6-4fae-b234-7f7661406dd1"). InnerVolumeSpecName "kube-api-access-sdhv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.059497 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd8b0555-5ea6-4fae-b234-7f7661406dd1-scripts" (OuterVolumeSpecName: "scripts") pod "bd8b0555-5ea6-4fae-b234-7f7661406dd1" (UID: "bd8b0555-5ea6-4fae-b234-7f7661406dd1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.074542 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd8b0555-5ea6-4fae-b234-7f7661406dd1-config-data" (OuterVolumeSpecName: "config-data") pod "bd8b0555-5ea6-4fae-b234-7f7661406dd1" (UID: "bd8b0555-5ea6-4fae-b234-7f7661406dd1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.090989 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd8b0555-5ea6-4fae-b234-7f7661406dd1-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.091023 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdhv8\" (UniqueName: \"kubernetes.io/projected/bd8b0555-5ea6-4fae-b234-7f7661406dd1-kube-api-access-sdhv8\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.091037 4565 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bd8b0555-5ea6-4fae-b234-7f7661406dd1-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.091046 4565 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd8b0555-5ea6-4fae-b234-7f7661406dd1-logs\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.091054 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd8b0555-5ea6-4fae-b234-7f7661406dd1-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.113288 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6405915-6a04-4cdb-b837-6f12e31bb7bc" path="/var/lib/kubelet/pods/f6405915-6a04-4cdb-b837-6f12e31bb7bc/volumes" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.223142 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wgp4j"] Nov 25 09:49:21 crc kubenswrapper[4565]: E1125 09:49:21.223580 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd8b0555-5ea6-4fae-b234-7f7661406dd1" containerName="horizon" Nov 25 
09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.223608 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd8b0555-5ea6-4fae-b234-7f7661406dd1" containerName="horizon" Nov 25 09:49:21 crc kubenswrapper[4565]: E1125 09:49:21.223638 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6405915-6a04-4cdb-b837-6f12e31bb7bc" containerName="horizon-log" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.223645 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6405915-6a04-4cdb-b837-6f12e31bb7bc" containerName="horizon-log" Nov 25 09:49:21 crc kubenswrapper[4565]: E1125 09:49:21.223660 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6405915-6a04-4cdb-b837-6f12e31bb7bc" containerName="horizon" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.223666 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6405915-6a04-4cdb-b837-6f12e31bb7bc" containerName="horizon" Nov 25 09:49:21 crc kubenswrapper[4565]: E1125 09:49:21.223681 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd8b0555-5ea6-4fae-b234-7f7661406dd1" containerName="horizon-log" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.223689 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd8b0555-5ea6-4fae-b234-7f7661406dd1" containerName="horizon-log" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.223917 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6405915-6a04-4cdb-b837-6f12e31bb7bc" containerName="horizon-log" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.223965 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd8b0555-5ea6-4fae-b234-7f7661406dd1" containerName="horizon-log" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.223987 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd8b0555-5ea6-4fae-b234-7f7661406dd1" containerName="horizon" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.224002 4565 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f6405915-6a04-4cdb-b837-6f12e31bb7bc" containerName="horizon" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.226489 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wgp4j" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.236103 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wgp4j"] Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.279307 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.295495 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/086c74f4-2c43-43a5-953d-0510a030706a-utilities\") pod \"community-operators-wgp4j\" (UID: \"086c74f4-2c43-43a5-953d-0510a030706a\") " pod="openshift-marketplace/community-operators-wgp4j" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.295611 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/086c74f4-2c43-43a5-953d-0510a030706a-catalog-content\") pod \"community-operators-wgp4j\" (UID: \"086c74f4-2c43-43a5-953d-0510a030706a\") " pod="openshift-marketplace/community-operators-wgp4j" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.295738 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jx26\" (UniqueName: \"kubernetes.io/projected/086c74f4-2c43-43a5-953d-0510a030706a-kube-api-access-5jx26\") pod \"community-operators-wgp4j\" (UID: \"086c74f4-2c43-43a5-953d-0510a030706a\") " pod="openshift-marketplace/community-operators-wgp4j" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.324596 4565 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.398056 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/086c74f4-2c43-43a5-953d-0510a030706a-catalog-content\") pod \"community-operators-wgp4j\" (UID: \"086c74f4-2c43-43a5-953d-0510a030706a\") " pod="openshift-marketplace/community-operators-wgp4j" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.398188 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jx26\" (UniqueName: \"kubernetes.io/projected/086c74f4-2c43-43a5-953d-0510a030706a-kube-api-access-5jx26\") pod \"community-operators-wgp4j\" (UID: \"086c74f4-2c43-43a5-953d-0510a030706a\") " pod="openshift-marketplace/community-operators-wgp4j" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.398495 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/086c74f4-2c43-43a5-953d-0510a030706a-utilities\") pod \"community-operators-wgp4j\" (UID: \"086c74f4-2c43-43a5-953d-0510a030706a\") " pod="openshift-marketplace/community-operators-wgp4j" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.399025 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/086c74f4-2c43-43a5-953d-0510a030706a-catalog-content\") pod \"community-operators-wgp4j\" (UID: \"086c74f4-2c43-43a5-953d-0510a030706a\") " pod="openshift-marketplace/community-operators-wgp4j" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.399484 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/086c74f4-2c43-43a5-953d-0510a030706a-utilities\") pod \"community-operators-wgp4j\" (UID: \"086c74f4-2c43-43a5-953d-0510a030706a\") " 
pod="openshift-marketplace/community-operators-wgp4j" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.420805 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jx26\" (UniqueName: \"kubernetes.io/projected/086c74f4-2c43-43a5-953d-0510a030706a-kube-api-access-5jx26\") pod \"community-operators-wgp4j\" (UID: \"086c74f4-2c43-43a5-953d-0510a030706a\") " pod="openshift-marketplace/community-operators-wgp4j" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.540837 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wgp4j" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.655022 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf889ed2-4f79-4b35-b7d9-860583df2614","Type":"ContainerStarted","Data":"f0c9375d2c051c77dda2c1c7677e4bfbdeb5c67e7668a75985d5cc5d20c1ab58"} Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.669901 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="acd7bff4-af86-442a-b108-dfa46e230085" containerName="manila-scheduler" containerID="cri-o://ecb242d8ba237a50659257a5780081dabb328054106928762b1dabdf8f71a3bc" gracePeriod=30 Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.670227 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5bbcb89d85-28rqw" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.671212 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bbcb89d85-28rqw" event={"ID":"bd8b0555-5ea6-4fae-b234-7f7661406dd1","Type":"ContainerDied","Data":"5d0224d113bfffd0e85692de07c0f00e962764f95332a92bd03c4239ce0d852b"} Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.671246 4565 scope.go:117] "RemoveContainer" containerID="3591c5f5307d9cc5f2e73bed19564fa71b36948cdbcbf7fa45f1145c95fe891c" Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.671546 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="acd7bff4-af86-442a-b108-dfa46e230085" containerName="probe" containerID="cri-o://8100481176bcc7be5d7fe2920ecf756d06369e5db67af034f09890b7c638b5d3" gracePeriod=30 Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.699121 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5bbcb89d85-28rqw"] Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.715451 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5bbcb89d85-28rqw"] Nov 25 09:49:21 crc kubenswrapper[4565]: I1125 09:49:21.863132 4565 scope.go:117] "RemoveContainer" containerID="e58fe857475f1c2ec8bdb344f6d08e289a16dbdf058d1306864fa7ed5d4c3a0b" Nov 25 09:49:22 crc kubenswrapper[4565]: I1125 09:49:22.142540 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wgp4j"] Nov 25 09:49:22 crc kubenswrapper[4565]: I1125 09:49:22.691643 4565 generic.go:334] "Generic (PLEG): container finished" podID="086c74f4-2c43-43a5-953d-0510a030706a" containerID="34651906671a225f5cd738dca6298664bf395d6b408885ec69f16e38c608625a" exitCode=0 Nov 25 09:49:22 crc kubenswrapper[4565]: I1125 09:49:22.692229 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgp4j" 
event={"ID":"086c74f4-2c43-43a5-953d-0510a030706a","Type":"ContainerDied","Data":"34651906671a225f5cd738dca6298664bf395d6b408885ec69f16e38c608625a"} Nov 25 09:49:22 crc kubenswrapper[4565]: I1125 09:49:22.692337 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgp4j" event={"ID":"086c74f4-2c43-43a5-953d-0510a030706a","Type":"ContainerStarted","Data":"301c7b2491856978feb2ba288444508284e4020aed831d71b35495a720e22ad7"} Nov 25 09:49:22 crc kubenswrapper[4565]: I1125 09:49:22.694905 4565 generic.go:334] "Generic (PLEG): container finished" podID="acd7bff4-af86-442a-b108-dfa46e230085" containerID="8100481176bcc7be5d7fe2920ecf756d06369e5db67af034f09890b7c638b5d3" exitCode=0 Nov 25 09:49:22 crc kubenswrapper[4565]: I1125 09:49:22.694998 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"acd7bff4-af86-442a-b108-dfa46e230085","Type":"ContainerDied","Data":"8100481176bcc7be5d7fe2920ecf756d06369e5db67af034f09890b7c638b5d3"} Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.114600 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd8b0555-5ea6-4fae-b234-7f7661406dd1" path="/var/lib/kubelet/pods/bd8b0555-5ea6-4fae-b234-7f7661406dd1/volumes" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.154443 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.246578 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acd7bff4-af86-442a-b108-dfa46e230085-scripts\") pod \"acd7bff4-af86-442a-b108-dfa46e230085\" (UID: \"acd7bff4-af86-442a-b108-dfa46e230085\") " Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.246734 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzp2d\" (UniqueName: \"kubernetes.io/projected/acd7bff4-af86-442a-b108-dfa46e230085-kube-api-access-wzp2d\") pod \"acd7bff4-af86-442a-b108-dfa46e230085\" (UID: \"acd7bff4-af86-442a-b108-dfa46e230085\") " Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.246808 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd7bff4-af86-442a-b108-dfa46e230085-combined-ca-bundle\") pod \"acd7bff4-af86-442a-b108-dfa46e230085\" (UID: \"acd7bff4-af86-442a-b108-dfa46e230085\") " Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.246891 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/acd7bff4-af86-442a-b108-dfa46e230085-config-data-custom\") pod \"acd7bff4-af86-442a-b108-dfa46e230085\" (UID: \"acd7bff4-af86-442a-b108-dfa46e230085\") " Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.246947 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/acd7bff4-af86-442a-b108-dfa46e230085-etc-machine-id\") pod \"acd7bff4-af86-442a-b108-dfa46e230085\" (UID: \"acd7bff4-af86-442a-b108-dfa46e230085\") " Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.247000 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/acd7bff4-af86-442a-b108-dfa46e230085-config-data\") pod \"acd7bff4-af86-442a-b108-dfa46e230085\" (UID: \"acd7bff4-af86-442a-b108-dfa46e230085\") " Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.247627 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acd7bff4-af86-442a-b108-dfa46e230085-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "acd7bff4-af86-442a-b108-dfa46e230085" (UID: "acd7bff4-af86-442a-b108-dfa46e230085"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.247971 4565 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/acd7bff4-af86-442a-b108-dfa46e230085-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.259202 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd7bff4-af86-442a-b108-dfa46e230085-scripts" (OuterVolumeSpecName: "scripts") pod "acd7bff4-af86-442a-b108-dfa46e230085" (UID: "acd7bff4-af86-442a-b108-dfa46e230085"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.259234 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acd7bff4-af86-442a-b108-dfa46e230085-kube-api-access-wzp2d" (OuterVolumeSpecName: "kube-api-access-wzp2d") pod "acd7bff4-af86-442a-b108-dfa46e230085" (UID: "acd7bff4-af86-442a-b108-dfa46e230085"). InnerVolumeSpecName "kube-api-access-wzp2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.259307 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd7bff4-af86-442a-b108-dfa46e230085-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "acd7bff4-af86-442a-b108-dfa46e230085" (UID: "acd7bff4-af86-442a-b108-dfa46e230085"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.306280 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd7bff4-af86-442a-b108-dfa46e230085-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "acd7bff4-af86-442a-b108-dfa46e230085" (UID: "acd7bff4-af86-442a-b108-dfa46e230085"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.349354 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acd7bff4-af86-442a-b108-dfa46e230085-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.349385 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzp2d\" (UniqueName: \"kubernetes.io/projected/acd7bff4-af86-442a-b108-dfa46e230085-kube-api-access-wzp2d\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.349398 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd7bff4-af86-442a-b108-dfa46e230085-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.349407 4565 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/acd7bff4-af86-442a-b108-dfa46e230085-config-data-custom\") 
on node \"crc\" DevicePath \"\"" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.353882 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd7bff4-af86-442a-b108-dfa46e230085-config-data" (OuterVolumeSpecName: "config-data") pod "acd7bff4-af86-442a-b108-dfa46e230085" (UID: "acd7bff4-af86-442a-b108-dfa46e230085"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.452243 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd7bff4-af86-442a-b108-dfa46e230085-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.711961 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf889ed2-4f79-4b35-b7d9-860583df2614","Type":"ContainerStarted","Data":"cf75c0f92f5215eb1e997c5ef18158006fd775c606bf1111db98102878a25fdb"} Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.712101 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.715187 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgp4j" event={"ID":"086c74f4-2c43-43a5-953d-0510a030706a","Type":"ContainerStarted","Data":"c87877e6c0907db46d4c90187fceb9facddccdea0b1f4443b811e646f09d59ae"} Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.717678 4565 generic.go:334] "Generic (PLEG): container finished" podID="acd7bff4-af86-442a-b108-dfa46e230085" containerID="ecb242d8ba237a50659257a5780081dabb328054106928762b1dabdf8f71a3bc" exitCode=0 Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.717722 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" 
event={"ID":"acd7bff4-af86-442a-b108-dfa46e230085","Type":"ContainerDied","Data":"ecb242d8ba237a50659257a5780081dabb328054106928762b1dabdf8f71a3bc"} Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.717766 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"acd7bff4-af86-442a-b108-dfa46e230085","Type":"ContainerDied","Data":"291e4716c6c5def04cced4a6c6a666d796190709d7dfe0d3c291243a80c9d453"} Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.717794 4565 scope.go:117] "RemoveContainer" containerID="8100481176bcc7be5d7fe2920ecf756d06369e5db67af034f09890b7c638b5d3" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.717732 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.746582 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.735516447 podStartE2EDuration="6.746568083s" podCreationTimestamp="2025-11-25 09:49:17 +0000 UTC" firstStartedPulling="2025-11-25 09:49:18.631313599 +0000 UTC m=+2691.833808737" lastFinishedPulling="2025-11-25 09:49:22.642365235 +0000 UTC m=+2695.844860373" observedRunningTime="2025-11-25 09:49:23.74202594 +0000 UTC m=+2696.944521077" watchObservedRunningTime="2025-11-25 09:49:23.746568083 +0000 UTC m=+2696.949063221" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.773585 4565 scope.go:117] "RemoveContainer" containerID="ecb242d8ba237a50659257a5780081dabb328054106928762b1dabdf8f71a3bc" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.777402 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.803992 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.817295 4565 scope.go:117] 
"RemoveContainer" containerID="8100481176bcc7be5d7fe2920ecf756d06369e5db67af034f09890b7c638b5d3" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.822050 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Nov 25 09:49:23 crc kubenswrapper[4565]: E1125 09:49:23.822765 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acd7bff4-af86-442a-b108-dfa46e230085" containerName="manila-scheduler" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.822838 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd7bff4-af86-442a-b108-dfa46e230085" containerName="manila-scheduler" Nov 25 09:49:23 crc kubenswrapper[4565]: E1125 09:49:23.822906 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acd7bff4-af86-442a-b108-dfa46e230085" containerName="probe" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.822973 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd7bff4-af86-442a-b108-dfa46e230085" containerName="probe" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.823279 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="acd7bff4-af86-442a-b108-dfa46e230085" containerName="manila-scheduler" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.823339 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="acd7bff4-af86-442a-b108-dfa46e230085" containerName="probe" Nov 25 09:49:23 crc kubenswrapper[4565]: E1125 09:49:23.822096 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8100481176bcc7be5d7fe2920ecf756d06369e5db67af034f09890b7c638b5d3\": container with ID starting with 8100481176bcc7be5d7fe2920ecf756d06369e5db67af034f09890b7c638b5d3 not found: ID does not exist" containerID="8100481176bcc7be5d7fe2920ecf756d06369e5db67af034f09890b7c638b5d3" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.824633 4565 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"8100481176bcc7be5d7fe2920ecf756d06369e5db67af034f09890b7c638b5d3"} err="failed to get container status \"8100481176bcc7be5d7fe2920ecf756d06369e5db67af034f09890b7c638b5d3\": rpc error: code = NotFound desc = could not find container \"8100481176bcc7be5d7fe2920ecf756d06369e5db67af034f09890b7c638b5d3\": container with ID starting with 8100481176bcc7be5d7fe2920ecf756d06369e5db67af034f09890b7c638b5d3 not found: ID does not exist" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.824675 4565 scope.go:117] "RemoveContainer" containerID="ecb242d8ba237a50659257a5780081dabb328054106928762b1dabdf8f71a3bc" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.824766 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Nov 25 09:49:23 crc kubenswrapper[4565]: E1125 09:49:23.825150 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecb242d8ba237a50659257a5780081dabb328054106928762b1dabdf8f71a3bc\": container with ID starting with ecb242d8ba237a50659257a5780081dabb328054106928762b1dabdf8f71a3bc not found: ID does not exist" containerID="ecb242d8ba237a50659257a5780081dabb328054106928762b1dabdf8f71a3bc" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.825182 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecb242d8ba237a50659257a5780081dabb328054106928762b1dabdf8f71a3bc"} err="failed to get container status \"ecb242d8ba237a50659257a5780081dabb328054106928762b1dabdf8f71a3bc\": rpc error: code = NotFound desc = could not find container \"ecb242d8ba237a50659257a5780081dabb328054106928762b1dabdf8f71a3bc\": container with ID starting with ecb242d8ba237a50659257a5780081dabb328054106928762b1dabdf8f71a3bc not found: ID does not exist" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.829858 4565 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"manila-scheduler-config-data" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.841884 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.862877 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fef203b-c8bb-4fb3-9415-b042e6837bff-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8fef203b-c8bb-4fb3-9415-b042e6837bff\") " pod="openstack/manila-scheduler-0" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.863227 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fef203b-c8bb-4fb3-9415-b042e6837bff-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8fef203b-c8bb-4fb3-9415-b042e6837bff\") " pod="openstack/manila-scheduler-0" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.863266 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8fef203b-c8bb-4fb3-9415-b042e6837bff-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8fef203b-c8bb-4fb3-9415-b042e6837bff\") " pod="openstack/manila-scheduler-0" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.863299 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvlr4\" (UniqueName: \"kubernetes.io/projected/8fef203b-c8bb-4fb3-9415-b042e6837bff-kube-api-access-wvlr4\") pod \"manila-scheduler-0\" (UID: \"8fef203b-c8bb-4fb3-9415-b042e6837bff\") " pod="openstack/manila-scheduler-0" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.863324 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8fef203b-c8bb-4fb3-9415-b042e6837bff-scripts\") pod \"manila-scheduler-0\" (UID: \"8fef203b-c8bb-4fb3-9415-b042e6837bff\") " pod="openstack/manila-scheduler-0" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.863585 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fef203b-c8bb-4fb3-9415-b042e6837bff-config-data\") pod \"manila-scheduler-0\" (UID: \"8fef203b-c8bb-4fb3-9415-b042e6837bff\") " pod="openstack/manila-scheduler-0" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.965605 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fef203b-c8bb-4fb3-9415-b042e6837bff-scripts\") pod \"manila-scheduler-0\" (UID: \"8fef203b-c8bb-4fb3-9415-b042e6837bff\") " pod="openstack/manila-scheduler-0" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.965717 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fef203b-c8bb-4fb3-9415-b042e6837bff-config-data\") pod \"manila-scheduler-0\" (UID: \"8fef203b-c8bb-4fb3-9415-b042e6837bff\") " pod="openstack/manila-scheduler-0" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.966190 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fef203b-c8bb-4fb3-9415-b042e6837bff-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8fef203b-c8bb-4fb3-9415-b042e6837bff\") " pod="openstack/manila-scheduler-0" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.966239 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fef203b-c8bb-4fb3-9415-b042e6837bff-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8fef203b-c8bb-4fb3-9415-b042e6837bff\") " 
pod="openstack/manila-scheduler-0" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.966266 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8fef203b-c8bb-4fb3-9415-b042e6837bff-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8fef203b-c8bb-4fb3-9415-b042e6837bff\") " pod="openstack/manila-scheduler-0" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.966293 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvlr4\" (UniqueName: \"kubernetes.io/projected/8fef203b-c8bb-4fb3-9415-b042e6837bff-kube-api-access-wvlr4\") pod \"manila-scheduler-0\" (UID: \"8fef203b-c8bb-4fb3-9415-b042e6837bff\") " pod="openstack/manila-scheduler-0" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.966481 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8fef203b-c8bb-4fb3-9415-b042e6837bff-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8fef203b-c8bb-4fb3-9415-b042e6837bff\") " pod="openstack/manila-scheduler-0" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.969756 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fef203b-c8bb-4fb3-9415-b042e6837bff-scripts\") pod \"manila-scheduler-0\" (UID: \"8fef203b-c8bb-4fb3-9415-b042e6837bff\") " pod="openstack/manila-scheduler-0" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.974602 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fef203b-c8bb-4fb3-9415-b042e6837bff-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8fef203b-c8bb-4fb3-9415-b042e6837bff\") " pod="openstack/manila-scheduler-0" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.991536 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/8fef203b-c8bb-4fb3-9415-b042e6837bff-config-data\") pod \"manila-scheduler-0\" (UID: \"8fef203b-c8bb-4fb3-9415-b042e6837bff\") " pod="openstack/manila-scheduler-0" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.992474 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fef203b-c8bb-4fb3-9415-b042e6837bff-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8fef203b-c8bb-4fb3-9415-b042e6837bff\") " pod="openstack/manila-scheduler-0" Nov 25 09:49:23 crc kubenswrapper[4565]: I1125 09:49:23.996371 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvlr4\" (UniqueName: \"kubernetes.io/projected/8fef203b-c8bb-4fb3-9415-b042e6837bff-kube-api-access-wvlr4\") pod \"manila-scheduler-0\" (UID: \"8fef203b-c8bb-4fb3-9415-b042e6837bff\") " pod="openstack/manila-scheduler-0" Nov 25 09:49:24 crc kubenswrapper[4565]: I1125 09:49:24.148780 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Nov 25 09:49:24 crc kubenswrapper[4565]: W1125 09:49:24.665435 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fef203b_c8bb_4fb3_9415_b042e6837bff.slice/crio-0ebb8e9b30429aad091315898473d2d5f07af077998e7981d214d5db460f93c5 WatchSource:0}: Error finding container 0ebb8e9b30429aad091315898473d2d5f07af077998e7981d214d5db460f93c5: Status 404 returned error can't find the container with id 0ebb8e9b30429aad091315898473d2d5f07af077998e7981d214d5db460f93c5 Nov 25 09:49:24 crc kubenswrapper[4565]: I1125 09:49:24.667109 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Nov 25 09:49:24 crc kubenswrapper[4565]: I1125 09:49:24.735016 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8fef203b-c8bb-4fb3-9415-b042e6837bff","Type":"ContainerStarted","Data":"0ebb8e9b30429aad091315898473d2d5f07af077998e7981d214d5db460f93c5"} Nov 25 09:49:24 crc kubenswrapper[4565]: I1125 09:49:24.739470 4565 generic.go:334] "Generic (PLEG): container finished" podID="086c74f4-2c43-43a5-953d-0510a030706a" containerID="c87877e6c0907db46d4c90187fceb9facddccdea0b1f4443b811e646f09d59ae" exitCode=0 Nov 25 09:49:24 crc kubenswrapper[4565]: I1125 09:49:24.739702 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgp4j" event={"ID":"086c74f4-2c43-43a5-953d-0510a030706a","Type":"ContainerDied","Data":"c87877e6c0907db46d4c90187fceb9facddccdea0b1f4443b811e646f09d59ae"} Nov 25 09:49:25 crc kubenswrapper[4565]: I1125 09:49:25.100606 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 
09:49:25 crc kubenswrapper[4565]: I1125 09:49:25.100702 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:49:25 crc kubenswrapper[4565]: I1125 09:49:25.149598 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acd7bff4-af86-442a-b108-dfa46e230085" path="/var/lib/kubelet/pods/acd7bff4-af86-442a-b108-dfa46e230085/volumes" Nov 25 09:49:25 crc kubenswrapper[4565]: I1125 09:49:25.151565 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 09:49:25 crc kubenswrapper[4565]: I1125 09:49:25.153664 4565 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dbd04505ddc8880f571911ca07bdffbff9a145427b2f29adc42f041a9dc56899"} pod="openshift-machine-config-operator/machine-config-daemon-r28bt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 09:49:25 crc kubenswrapper[4565]: I1125 09:49:25.153762 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" containerID="cri-o://dbd04505ddc8880f571911ca07bdffbff9a145427b2f29adc42f041a9dc56899" gracePeriod=600 Nov 25 09:49:25 crc kubenswrapper[4565]: I1125 09:49:25.774272 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8fef203b-c8bb-4fb3-9415-b042e6837bff","Type":"ContainerStarted","Data":"c958b7d0610dd791fa2c5e285c8581b3cae577c8bcd731f137aab5caf93e12c8"} Nov 25 09:49:25 crc 
kubenswrapper[4565]: I1125 09:49:25.791867 4565 generic.go:334] "Generic (PLEG): container finished" podID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerID="dbd04505ddc8880f571911ca07bdffbff9a145427b2f29adc42f041a9dc56899" exitCode=0 Nov 25 09:49:25 crc kubenswrapper[4565]: I1125 09:49:25.791964 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerDied","Data":"dbd04505ddc8880f571911ca07bdffbff9a145427b2f29adc42f041a9dc56899"} Nov 25 09:49:25 crc kubenswrapper[4565]: I1125 09:49:25.792010 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerStarted","Data":"a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca"} Nov 25 09:49:25 crc kubenswrapper[4565]: I1125 09:49:25.792030 4565 scope.go:117] "RemoveContainer" containerID="59c9abee363166c8d3bf5f720b54790885bf6f5ae24facf0fab9e0b465ff1843" Nov 25 09:49:25 crc kubenswrapper[4565]: I1125 09:49:25.803769 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgp4j" event={"ID":"086c74f4-2c43-43a5-953d-0510a030706a","Type":"ContainerStarted","Data":"ff3a1c778b2761e13b33b434c28d2db05e4324f2b8fcc852b3660f8241f88c69"} Nov 25 09:49:25 crc kubenswrapper[4565]: I1125 09:49:25.836742 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wgp4j" podStartSLOduration=2.095727562 podStartE2EDuration="4.836726543s" podCreationTimestamp="2025-11-25 09:49:21 +0000 UTC" firstStartedPulling="2025-11-25 09:49:22.725263107 +0000 UTC m=+2695.927758246" lastFinishedPulling="2025-11-25 09:49:25.466262089 +0000 UTC m=+2698.668757227" observedRunningTime="2025-11-25 09:49:25.826228853 +0000 UTC m=+2699.028723991" watchObservedRunningTime="2025-11-25 
09:49:25.836726543 +0000 UTC m=+2699.039221672" Nov 25 09:49:26 crc kubenswrapper[4565]: I1125 09:49:26.813702 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8fef203b-c8bb-4fb3-9415-b042e6837bff","Type":"ContainerStarted","Data":"7bb49ba982c1f56985423c84afd99ce5944776817ec808e92a4697df98df7ab8"} Nov 25 09:49:27 crc kubenswrapper[4565]: I1125 09:49:27.727104 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:49:27 crc kubenswrapper[4565]: I1125 09:49:27.739616 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:49:27 crc kubenswrapper[4565]: I1125 09:49:27.763080 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=4.763058843 podStartE2EDuration="4.763058843s" podCreationTimestamp="2025-11-25 09:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:49:26.842205463 +0000 UTC m=+2700.044700601" watchObservedRunningTime="2025-11-25 09:49:27.763058843 +0000 UTC m=+2700.965553982" Nov 25 09:49:28 crc kubenswrapper[4565]: I1125 09:49:28.644975 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Nov 25 09:49:29 crc kubenswrapper[4565]: I1125 09:49:29.152708 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:49:29 crc kubenswrapper[4565]: I1125 09:49:29.153180 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bf889ed2-4f79-4b35-b7d9-860583df2614" containerName="ceilometer-central-agent" containerID="cri-o://3943420a92278ad409a1aeafd333ec0d3fad6752f6c8fdbf8784afac2143befc" gracePeriod=30 Nov 25 09:49:29 crc kubenswrapper[4565]: I1125 
09:49:29.153595 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bf889ed2-4f79-4b35-b7d9-860583df2614" containerName="proxy-httpd" containerID="cri-o://cf75c0f92f5215eb1e997c5ef18158006fd775c606bf1111db98102878a25fdb" gracePeriod=30 Nov 25 09:49:29 crc kubenswrapper[4565]: I1125 09:49:29.153663 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bf889ed2-4f79-4b35-b7d9-860583df2614" containerName="sg-core" containerID="cri-o://f0c9375d2c051c77dda2c1c7677e4bfbdeb5c67e7668a75985d5cc5d20c1ab58" gracePeriod=30 Nov 25 09:49:29 crc kubenswrapper[4565]: I1125 09:49:29.153705 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bf889ed2-4f79-4b35-b7d9-860583df2614" containerName="ceilometer-notification-agent" containerID="cri-o://3c9c8d70e1328ddd506d9962728090d1d03f37031dc47d277d468f272e16d180" gracePeriod=30 Nov 25 09:49:29 crc kubenswrapper[4565]: I1125 09:49:29.852015 4565 generic.go:334] "Generic (PLEG): container finished" podID="bf889ed2-4f79-4b35-b7d9-860583df2614" containerID="cf75c0f92f5215eb1e997c5ef18158006fd775c606bf1111db98102878a25fdb" exitCode=0 Nov 25 09:49:29 crc kubenswrapper[4565]: I1125 09:49:29.852273 4565 generic.go:334] "Generic (PLEG): container finished" podID="bf889ed2-4f79-4b35-b7d9-860583df2614" containerID="f0c9375d2c051c77dda2c1c7677e4bfbdeb5c67e7668a75985d5cc5d20c1ab58" exitCode=2 Nov 25 09:49:29 crc kubenswrapper[4565]: I1125 09:49:29.852282 4565 generic.go:334] "Generic (PLEG): container finished" podID="bf889ed2-4f79-4b35-b7d9-860583df2614" containerID="3c9c8d70e1328ddd506d9962728090d1d03f37031dc47d277d468f272e16d180" exitCode=0 Nov 25 09:49:29 crc kubenswrapper[4565]: I1125 09:49:29.852288 4565 generic.go:334] "Generic (PLEG): container finished" podID="bf889ed2-4f79-4b35-b7d9-860583df2614" 
containerID="3943420a92278ad409a1aeafd333ec0d3fad6752f6c8fdbf8784afac2143befc" exitCode=0 Nov 25 09:49:29 crc kubenswrapper[4565]: I1125 09:49:29.852310 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf889ed2-4f79-4b35-b7d9-860583df2614","Type":"ContainerDied","Data":"cf75c0f92f5215eb1e997c5ef18158006fd775c606bf1111db98102878a25fdb"} Nov 25 09:49:29 crc kubenswrapper[4565]: I1125 09:49:29.852341 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf889ed2-4f79-4b35-b7d9-860583df2614","Type":"ContainerDied","Data":"f0c9375d2c051c77dda2c1c7677e4bfbdeb5c67e7668a75985d5cc5d20c1ab58"} Nov 25 09:49:29 crc kubenswrapper[4565]: I1125 09:49:29.852352 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf889ed2-4f79-4b35-b7d9-860583df2614","Type":"ContainerDied","Data":"3c9c8d70e1328ddd506d9962728090d1d03f37031dc47d277d468f272e16d180"} Nov 25 09:49:29 crc kubenswrapper[4565]: I1125 09:49:29.852364 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf889ed2-4f79-4b35-b7d9-860583df2614","Type":"ContainerDied","Data":"3943420a92278ad409a1aeafd333ec0d3fad6752f6c8fdbf8784afac2143befc"} Nov 25 09:49:29 crc kubenswrapper[4565]: I1125 09:49:29.946415 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.144456 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-scripts\") pod \"bf889ed2-4f79-4b35-b7d9-860583df2614\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.144942 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf889ed2-4f79-4b35-b7d9-860583df2614-log-httpd\") pod \"bf889ed2-4f79-4b35-b7d9-860583df2614\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.144975 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf889ed2-4f79-4b35-b7d9-860583df2614-run-httpd\") pod \"bf889ed2-4f79-4b35-b7d9-860583df2614\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.144996 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-sg-core-conf-yaml\") pod \"bf889ed2-4f79-4b35-b7d9-860583df2614\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.145182 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-ceilometer-tls-certs\") pod \"bf889ed2-4f79-4b35-b7d9-860583df2614\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.145217 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-config-data\") pod \"bf889ed2-4f79-4b35-b7d9-860583df2614\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.145307 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb8nr\" (UniqueName: \"kubernetes.io/projected/bf889ed2-4f79-4b35-b7d9-860583df2614-kube-api-access-vb8nr\") pod \"bf889ed2-4f79-4b35-b7d9-860583df2614\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.145499 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf889ed2-4f79-4b35-b7d9-860583df2614-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bf889ed2-4f79-4b35-b7d9-860583df2614" (UID: "bf889ed2-4f79-4b35-b7d9-860583df2614"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.145574 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-combined-ca-bundle\") pod \"bf889ed2-4f79-4b35-b7d9-860583df2614\" (UID: \"bf889ed2-4f79-4b35-b7d9-860583df2614\") " Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.145622 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf889ed2-4f79-4b35-b7d9-860583df2614-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bf889ed2-4f79-4b35-b7d9-860583df2614" (UID: "bf889ed2-4f79-4b35-b7d9-860583df2614"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.147975 4565 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf889ed2-4f79-4b35-b7d9-860583df2614-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.148173 4565 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf889ed2-4f79-4b35-b7d9-860583df2614-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.153829 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-scripts" (OuterVolumeSpecName: "scripts") pod "bf889ed2-4f79-4b35-b7d9-860583df2614" (UID: "bf889ed2-4f79-4b35-b7d9-860583df2614"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.156751 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf889ed2-4f79-4b35-b7d9-860583df2614-kube-api-access-vb8nr" (OuterVolumeSpecName: "kube-api-access-vb8nr") pod "bf889ed2-4f79-4b35-b7d9-860583df2614" (UID: "bf889ed2-4f79-4b35-b7d9-860583df2614"). InnerVolumeSpecName "kube-api-access-vb8nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.190862 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bf889ed2-4f79-4b35-b7d9-860583df2614" (UID: "bf889ed2-4f79-4b35-b7d9-860583df2614"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.199276 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "bf889ed2-4f79-4b35-b7d9-860583df2614" (UID: "bf889ed2-4f79-4b35-b7d9-860583df2614"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.225194 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf889ed2-4f79-4b35-b7d9-860583df2614" (UID: "bf889ed2-4f79-4b35-b7d9-860583df2614"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.230556 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6549bb6ccb-qd7ll" Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.250450 4565 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.250479 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vb8nr\" (UniqueName: \"kubernetes.io/projected/bf889ed2-4f79-4b35-b7d9-860583df2614-kube-api-access-vb8nr\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.250489 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:30 crc 
kubenswrapper[4565]: I1125 09:49:30.250499 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.250508 4565 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.255217 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.266131 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-config-data" (OuterVolumeSpecName: "config-data") pod "bf889ed2-4f79-4b35-b7d9-860583df2614" (UID: "bf889ed2-4f79-4b35-b7d9-860583df2614"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.323617 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6848d4c5cd-8fv74"] Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.355519 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf889ed2-4f79-4b35-b7d9-860583df2614-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.687482 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.778971 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.910870 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="eca2257b-f543-4f54-a433-dc3d852b6f59" containerName="manila-share" containerID="cri-o://bc84a9f864de84083c43ec971810c5726d5e76174c8603a8355e2fa7acde3f49" gracePeriod=30 Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.911222 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.913979 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="eca2257b-f543-4f54-a433-dc3d852b6f59" containerName="probe" containerID="cri-o://f30cc3c7597fac6264afd18deacc583a2754c3f4e3f24458cbdb41f88c4b6eed" gracePeriod=30 Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.914240 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6848d4c5cd-8fv74" podUID="79e96477-2f1b-49e7-89f9-d6a18694af63" containerName="horizon-log" containerID="cri-o://9eb2815b0f68a95e51f96b7a202d3b8d46040bf18429f85713e40a2240b1f806" gracePeriod=30 Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.914354 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf889ed2-4f79-4b35-b7d9-860583df2614","Type":"ContainerDied","Data":"92e60f2ef82205984c18d3d660b187d1fd8032e7714d8f000f059f4eb17a1894"} Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.914268 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6848d4c5cd-8fv74" podUID="79e96477-2f1b-49e7-89f9-d6a18694af63" containerName="horizon" containerID="cri-o://6dab57c95f45fe3dc539cfa46a834007bafce1552d436157f5484e112416ef6a" gracePeriod=30 Nov 25 09:49:30 crc kubenswrapper[4565]: I1125 09:49:30.914436 4565 scope.go:117] "RemoveContainer" containerID="cf75c0f92f5215eb1e997c5ef18158006fd775c606bf1111db98102878a25fdb" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.044103 4565 scope.go:117] "RemoveContainer" containerID="f0c9375d2c051c77dda2c1c7677e4bfbdeb5c67e7668a75985d5cc5d20c1ab58" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.051874 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.099165 4565 scope.go:117] 
"RemoveContainer" containerID="3c9c8d70e1328ddd506d9962728090d1d03f37031dc47d277d468f272e16d180" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.099331 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.138841 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf889ed2-4f79-4b35-b7d9-860583df2614" path="/var/lib/kubelet/pods/bf889ed2-4f79-4b35-b7d9-860583df2614/volumes" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.139979 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:49:31 crc kubenswrapper[4565]: E1125 09:49:31.145536 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf889ed2-4f79-4b35-b7d9-860583df2614" containerName="proxy-httpd" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.145618 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf889ed2-4f79-4b35-b7d9-860583df2614" containerName="proxy-httpd" Nov 25 09:49:31 crc kubenswrapper[4565]: E1125 09:49:31.145696 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf889ed2-4f79-4b35-b7d9-860583df2614" containerName="ceilometer-notification-agent" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.145756 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf889ed2-4f79-4b35-b7d9-860583df2614" containerName="ceilometer-notification-agent" Nov 25 09:49:31 crc kubenswrapper[4565]: E1125 09:49:31.145844 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf889ed2-4f79-4b35-b7d9-860583df2614" containerName="ceilometer-central-agent" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.145889 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf889ed2-4f79-4b35-b7d9-860583df2614" containerName="ceilometer-central-agent" Nov 25 09:49:31 crc kubenswrapper[4565]: E1125 09:49:31.149777 4565 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bf889ed2-4f79-4b35-b7d9-860583df2614" containerName="sg-core" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.149861 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf889ed2-4f79-4b35-b7d9-860583df2614" containerName="sg-core" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.150226 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf889ed2-4f79-4b35-b7d9-860583df2614" containerName="ceilometer-central-agent" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.150302 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf889ed2-4f79-4b35-b7d9-860583df2614" containerName="sg-core" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.150364 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf889ed2-4f79-4b35-b7d9-860583df2614" containerName="proxy-httpd" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.150438 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf889ed2-4f79-4b35-b7d9-860583df2614" containerName="ceilometer-notification-agent" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.153398 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.153644 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.158245 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.158427 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.158544 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.171923 4565 scope.go:117] "RemoveContainer" containerID="3943420a92278ad409a1aeafd333ec0d3fad6752f6c8fdbf8784afac2143befc" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.322744 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d9524a-c577-4af2-a063-06c4928a3505-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"57d9524a-c577-4af2-a063-06c4928a3505\") " pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.323131 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57d9524a-c577-4af2-a063-06c4928a3505-log-httpd\") pod \"ceilometer-0\" (UID: \"57d9524a-c577-4af2-a063-06c4928a3505\") " pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.323156 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57d9524a-c577-4af2-a063-06c4928a3505-config-data\") pod \"ceilometer-0\" (UID: \"57d9524a-c577-4af2-a063-06c4928a3505\") " pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.323301 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57d9524a-c577-4af2-a063-06c4928a3505-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"57d9524a-c577-4af2-a063-06c4928a3505\") " pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.323402 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66t8b\" (UniqueName: \"kubernetes.io/projected/57d9524a-c577-4af2-a063-06c4928a3505-kube-api-access-66t8b\") pod \"ceilometer-0\" (UID: \"57d9524a-c577-4af2-a063-06c4928a3505\") " pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.324299 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57d9524a-c577-4af2-a063-06c4928a3505-scripts\") pod \"ceilometer-0\" (UID: \"57d9524a-c577-4af2-a063-06c4928a3505\") " pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.324339 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57d9524a-c577-4af2-a063-06c4928a3505-run-httpd\") pod \"ceilometer-0\" (UID: \"57d9524a-c577-4af2-a063-06c4928a3505\") " pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.324377 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d9524a-c577-4af2-a063-06c4928a3505-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"57d9524a-c577-4af2-a063-06c4928a3505\") " pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.426187 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57d9524a-c577-4af2-a063-06c4928a3505-scripts\") pod \"ceilometer-0\" 
(UID: \"57d9524a-c577-4af2-a063-06c4928a3505\") " pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.426250 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57d9524a-c577-4af2-a063-06c4928a3505-run-httpd\") pod \"ceilometer-0\" (UID: \"57d9524a-c577-4af2-a063-06c4928a3505\") " pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.426290 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d9524a-c577-4af2-a063-06c4928a3505-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"57d9524a-c577-4af2-a063-06c4928a3505\") " pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.426357 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d9524a-c577-4af2-a063-06c4928a3505-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"57d9524a-c577-4af2-a063-06c4928a3505\") " pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.426398 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57d9524a-c577-4af2-a063-06c4928a3505-log-httpd\") pod \"ceilometer-0\" (UID: \"57d9524a-c577-4af2-a063-06c4928a3505\") " pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.426416 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57d9524a-c577-4af2-a063-06c4928a3505-config-data\") pod \"ceilometer-0\" (UID: \"57d9524a-c577-4af2-a063-06c4928a3505\") " pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.426475 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57d9524a-c577-4af2-a063-06c4928a3505-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"57d9524a-c577-4af2-a063-06c4928a3505\") " pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.426514 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66t8b\" (UniqueName: \"kubernetes.io/projected/57d9524a-c577-4af2-a063-06c4928a3505-kube-api-access-66t8b\") pod \"ceilometer-0\" (UID: \"57d9524a-c577-4af2-a063-06c4928a3505\") " pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.426845 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57d9524a-c577-4af2-a063-06c4928a3505-run-httpd\") pod \"ceilometer-0\" (UID: \"57d9524a-c577-4af2-a063-06c4928a3505\") " pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.427132 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57d9524a-c577-4af2-a063-06c4928a3505-log-httpd\") pod \"ceilometer-0\" (UID: \"57d9524a-c577-4af2-a063-06c4928a3505\") " pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.439284 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d9524a-c577-4af2-a063-06c4928a3505-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"57d9524a-c577-4af2-a063-06c4928a3505\") " pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.442251 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57d9524a-c577-4af2-a063-06c4928a3505-scripts\") pod \"ceilometer-0\" (UID: \"57d9524a-c577-4af2-a063-06c4928a3505\") " pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 
09:49:31.444698 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d9524a-c577-4af2-a063-06c4928a3505-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"57d9524a-c577-4af2-a063-06c4928a3505\") " pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.449780 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66t8b\" (UniqueName: \"kubernetes.io/projected/57d9524a-c577-4af2-a063-06c4928a3505-kube-api-access-66t8b\") pod \"ceilometer-0\" (UID: \"57d9524a-c577-4af2-a063-06c4928a3505\") " pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.453732 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57d9524a-c577-4af2-a063-06c4928a3505-config-data\") pod \"ceilometer-0\" (UID: \"57d9524a-c577-4af2-a063-06c4928a3505\") " pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.464330 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57d9524a-c577-4af2-a063-06c4928a3505-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"57d9524a-c577-4af2-a063-06c4928a3505\") " pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.482965 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.541733 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wgp4j" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.541883 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wgp4j" Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.940260 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.941143 4565 generic.go:334] "Generic (PLEG): container finished" podID="eca2257b-f543-4f54-a433-dc3d852b6f59" containerID="f30cc3c7597fac6264afd18deacc583a2754c3f4e3f24458cbdb41f88c4b6eed" exitCode=0 Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.941184 4565 generic.go:334] "Generic (PLEG): container finished" podID="eca2257b-f543-4f54-a433-dc3d852b6f59" containerID="bc84a9f864de84083c43ec971810c5726d5e76174c8603a8355e2fa7acde3f49" exitCode=1 Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.941624 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"eca2257b-f543-4f54-a433-dc3d852b6f59","Type":"ContainerDied","Data":"f30cc3c7597fac6264afd18deacc583a2754c3f4e3f24458cbdb41f88c4b6eed"} Nov 25 09:49:31 crc kubenswrapper[4565]: I1125 09:49:31.941703 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"eca2257b-f543-4f54-a433-dc3d852b6f59","Type":"ContainerDied","Data":"bc84a9f864de84083c43ec971810c5726d5e76174c8603a8355e2fa7acde3f49"} Nov 25 09:49:31 crc kubenswrapper[4565]: W1125 09:49:31.963427 4565 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57d9524a_c577_4af2_a063_06c4928a3505.slice/crio-62455e9db94bfdb088af5082edb49202a2da7f5f613df7d23294c8ac409eeedc WatchSource:0}: Error finding container 62455e9db94bfdb088af5082edb49202a2da7f5f613df7d23294c8ac409eeedc: Status 404 returned error can't find the container with id 62455e9db94bfdb088af5082edb49202a2da7f5f613df7d23294c8ac409eeedc Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.249259 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.451066 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxl7j\" (UniqueName: \"kubernetes.io/projected/eca2257b-f543-4f54-a433-dc3d852b6f59-kube-api-access-mxl7j\") pod \"eca2257b-f543-4f54-a433-dc3d852b6f59\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.451486 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eca2257b-f543-4f54-a433-dc3d852b6f59-combined-ca-bundle\") pod \"eca2257b-f543-4f54-a433-dc3d852b6f59\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.451577 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eca2257b-f543-4f54-a433-dc3d852b6f59-config-data\") pod \"eca2257b-f543-4f54-a433-dc3d852b6f59\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.451643 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eca2257b-f543-4f54-a433-dc3d852b6f59-scripts\") pod \"eca2257b-f543-4f54-a433-dc3d852b6f59\" (UID: 
\"eca2257b-f543-4f54-a433-dc3d852b6f59\") " Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.451848 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eca2257b-f543-4f54-a433-dc3d852b6f59-config-data-custom\") pod \"eca2257b-f543-4f54-a433-dc3d852b6f59\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.451946 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/eca2257b-f543-4f54-a433-dc3d852b6f59-ceph\") pod \"eca2257b-f543-4f54-a433-dc3d852b6f59\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.452163 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/eca2257b-f543-4f54-a433-dc3d852b6f59-var-lib-manila\") pod \"eca2257b-f543-4f54-a433-dc3d852b6f59\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.452252 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eca2257b-f543-4f54-a433-dc3d852b6f59-etc-machine-id\") pod \"eca2257b-f543-4f54-a433-dc3d852b6f59\" (UID: \"eca2257b-f543-4f54-a433-dc3d852b6f59\") " Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.452567 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eca2257b-f543-4f54-a433-dc3d852b6f59-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "eca2257b-f543-4f54-a433-dc3d852b6f59" (UID: "eca2257b-f543-4f54-a433-dc3d852b6f59"). InnerVolumeSpecName "var-lib-manila". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.452679 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eca2257b-f543-4f54-a433-dc3d852b6f59-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "eca2257b-f543-4f54-a433-dc3d852b6f59" (UID: "eca2257b-f543-4f54-a433-dc3d852b6f59"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.452921 4565 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/eca2257b-f543-4f54-a433-dc3d852b6f59-var-lib-manila\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.453009 4565 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eca2257b-f543-4f54-a433-dc3d852b6f59-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.457558 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eca2257b-f543-4f54-a433-dc3d852b6f59-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "eca2257b-f543-4f54-a433-dc3d852b6f59" (UID: "eca2257b-f543-4f54-a433-dc3d852b6f59"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.464448 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eca2257b-f543-4f54-a433-dc3d852b6f59-kube-api-access-mxl7j" (OuterVolumeSpecName: "kube-api-access-mxl7j") pod "eca2257b-f543-4f54-a433-dc3d852b6f59" (UID: "eca2257b-f543-4f54-a433-dc3d852b6f59"). InnerVolumeSpecName "kube-api-access-mxl7j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.474362 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eca2257b-f543-4f54-a433-dc3d852b6f59-scripts" (OuterVolumeSpecName: "scripts") pod "eca2257b-f543-4f54-a433-dc3d852b6f59" (UID: "eca2257b-f543-4f54-a433-dc3d852b6f59"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.477992 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eca2257b-f543-4f54-a433-dc3d852b6f59-ceph" (OuterVolumeSpecName: "ceph") pod "eca2257b-f543-4f54-a433-dc3d852b6f59" (UID: "eca2257b-f543-4f54-a433-dc3d852b6f59"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.506801 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eca2257b-f543-4f54-a433-dc3d852b6f59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eca2257b-f543-4f54-a433-dc3d852b6f59" (UID: "eca2257b-f543-4f54-a433-dc3d852b6f59"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.561066 4565 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eca2257b-f543-4f54-a433-dc3d852b6f59-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.561419 4565 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/eca2257b-f543-4f54-a433-dc3d852b6f59-ceph\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.561432 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxl7j\" (UniqueName: \"kubernetes.io/projected/eca2257b-f543-4f54-a433-dc3d852b6f59-kube-api-access-mxl7j\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.561477 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eca2257b-f543-4f54-a433-dc3d852b6f59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.561486 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eca2257b-f543-4f54-a433-dc3d852b6f59-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.595554 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eca2257b-f543-4f54-a433-dc3d852b6f59-config-data" (OuterVolumeSpecName: "config-data") pod "eca2257b-f543-4f54-a433-dc3d852b6f59" (UID: "eca2257b-f543-4f54-a433-dc3d852b6f59"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.610539 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-wgp4j" podUID="086c74f4-2c43-43a5-953d-0510a030706a" containerName="registry-server" probeResult="failure" output=< Nov 25 09:49:32 crc kubenswrapper[4565]: timeout: failed to connect service ":50051" within 1s Nov 25 09:49:32 crc kubenswrapper[4565]: > Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.663849 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eca2257b-f543-4f54-a433-dc3d852b6f59-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.954341 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57d9524a-c577-4af2-a063-06c4928a3505","Type":"ContainerStarted","Data":"6f6384ded585fb8c7a14e41931dcb5e1c7a7d66b84cb92d44c4a9f350d7eec01"} Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.954393 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57d9524a-c577-4af2-a063-06c4928a3505","Type":"ContainerStarted","Data":"62455e9db94bfdb088af5082edb49202a2da7f5f613df7d23294c8ac409eeedc"} Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.961311 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"eca2257b-f543-4f54-a433-dc3d852b6f59","Type":"ContainerDied","Data":"93f7a3c45673ec6737a2de24cd328e7a06f1e277f30d265b6c0bb6ea1afd9853"} Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.961366 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Nov 25 09:49:32 crc kubenswrapper[4565]: I1125 09:49:32.961408 4565 scope.go:117] "RemoveContainer" containerID="f30cc3c7597fac6264afd18deacc583a2754c3f4e3f24458cbdb41f88c4b6eed" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.028255 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.033147 4565 scope.go:117] "RemoveContainer" containerID="bc84a9f864de84083c43ec971810c5726d5e76174c8603a8355e2fa7acde3f49" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.033298 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.042241 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Nov 25 09:49:33 crc kubenswrapper[4565]: E1125 09:49:33.042603 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca2257b-f543-4f54-a433-dc3d852b6f59" containerName="probe" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.042624 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca2257b-f543-4f54-a433-dc3d852b6f59" containerName="probe" Nov 25 09:49:33 crc kubenswrapper[4565]: E1125 09:49:33.042651 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca2257b-f543-4f54-a433-dc3d852b6f59" containerName="manila-share" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.042658 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca2257b-f543-4f54-a433-dc3d852b6f59" containerName="manila-share" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.042822 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="eca2257b-f543-4f54-a433-dc3d852b6f59" containerName="manila-share" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.042843 4565 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="eca2257b-f543-4f54-a433-dc3d852b6f59" containerName="probe" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.044946 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.048449 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.058760 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.075889 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60d980f4-b1c6-4991-ae13-143dcb6bf453-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"60d980f4-b1c6-4991-ae13-143dcb6bf453\") " pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.076031 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60d980f4-b1c6-4991-ae13-143dcb6bf453-config-data\") pod \"manila-share-share1-0\" (UID: \"60d980f4-b1c6-4991-ae13-143dcb6bf453\") " pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.076055 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60d980f4-b1c6-4991-ae13-143dcb6bf453-scripts\") pod \"manila-share-share1-0\" (UID: \"60d980f4-b1c6-4991-ae13-143dcb6bf453\") " pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.076099 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6rjf\" (UniqueName: 
\"kubernetes.io/projected/60d980f4-b1c6-4991-ae13-143dcb6bf453-kube-api-access-l6rjf\") pod \"manila-share-share1-0\" (UID: \"60d980f4-b1c6-4991-ae13-143dcb6bf453\") " pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.076165 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60d980f4-b1c6-4991-ae13-143dcb6bf453-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"60d980f4-b1c6-4991-ae13-143dcb6bf453\") " pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.076270 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/60d980f4-b1c6-4991-ae13-143dcb6bf453-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"60d980f4-b1c6-4991-ae13-143dcb6bf453\") " pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.076293 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/60d980f4-b1c6-4991-ae13-143dcb6bf453-ceph\") pod \"manila-share-share1-0\" (UID: \"60d980f4-b1c6-4991-ae13-143dcb6bf453\") " pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.076387 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d980f4-b1c6-4991-ae13-143dcb6bf453-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"60d980f4-b1c6-4991-ae13-143dcb6bf453\") " pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.108382 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eca2257b-f543-4f54-a433-dc3d852b6f59" 
path="/var/lib/kubelet/pods/eca2257b-f543-4f54-a433-dc3d852b6f59/volumes" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.177867 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60d980f4-b1c6-4991-ae13-143dcb6bf453-config-data\") pod \"manila-share-share1-0\" (UID: \"60d980f4-b1c6-4991-ae13-143dcb6bf453\") " pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.178347 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60d980f4-b1c6-4991-ae13-143dcb6bf453-scripts\") pod \"manila-share-share1-0\" (UID: \"60d980f4-b1c6-4991-ae13-143dcb6bf453\") " pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.178390 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6rjf\" (UniqueName: \"kubernetes.io/projected/60d980f4-b1c6-4991-ae13-143dcb6bf453-kube-api-access-l6rjf\") pod \"manila-share-share1-0\" (UID: \"60d980f4-b1c6-4991-ae13-143dcb6bf453\") " pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.178448 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60d980f4-b1c6-4991-ae13-143dcb6bf453-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"60d980f4-b1c6-4991-ae13-143dcb6bf453\") " pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.178534 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/60d980f4-b1c6-4991-ae13-143dcb6bf453-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"60d980f4-b1c6-4991-ae13-143dcb6bf453\") " pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.178570 
4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/60d980f4-b1c6-4991-ae13-143dcb6bf453-ceph\") pod \"manila-share-share1-0\" (UID: \"60d980f4-b1c6-4991-ae13-143dcb6bf453\") " pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.178645 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d980f4-b1c6-4991-ae13-143dcb6bf453-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"60d980f4-b1c6-4991-ae13-143dcb6bf453\") " pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.178690 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60d980f4-b1c6-4991-ae13-143dcb6bf453-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"60d980f4-b1c6-4991-ae13-143dcb6bf453\") " pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.178768 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/60d980f4-b1c6-4991-ae13-143dcb6bf453-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"60d980f4-b1c6-4991-ae13-143dcb6bf453\") " pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.179136 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60d980f4-b1c6-4991-ae13-143dcb6bf453-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"60d980f4-b1c6-4991-ae13-143dcb6bf453\") " pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.182395 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/60d980f4-b1c6-4991-ae13-143dcb6bf453-ceph\") pod \"manila-share-share1-0\" (UID: \"60d980f4-b1c6-4991-ae13-143dcb6bf453\") " pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.182750 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60d980f4-b1c6-4991-ae13-143dcb6bf453-config-data\") pod \"manila-share-share1-0\" (UID: \"60d980f4-b1c6-4991-ae13-143dcb6bf453\") " pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.182901 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d980f4-b1c6-4991-ae13-143dcb6bf453-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"60d980f4-b1c6-4991-ae13-143dcb6bf453\") " pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.185550 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60d980f4-b1c6-4991-ae13-143dcb6bf453-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"60d980f4-b1c6-4991-ae13-143dcb6bf453\") " pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.185597 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60d980f4-b1c6-4991-ae13-143dcb6bf453-scripts\") pod \"manila-share-share1-0\" (UID: \"60d980f4-b1c6-4991-ae13-143dcb6bf453\") " pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.197391 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6rjf\" (UniqueName: \"kubernetes.io/projected/60d980f4-b1c6-4991-ae13-143dcb6bf453-kube-api-access-l6rjf\") pod \"manila-share-share1-0\" (UID: \"60d980f4-b1c6-4991-ae13-143dcb6bf453\") " 
pod="openstack/manila-share-share1-0" Nov 25 09:49:33 crc kubenswrapper[4565]: I1125 09:49:33.369594 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Nov 25 09:49:34 crc kubenswrapper[4565]: I1125 09:49:34.008260 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57d9524a-c577-4af2-a063-06c4928a3505","Type":"ContainerStarted","Data":"c7b3f754fecc9f412eecf1ca14b7759644f39e438cd620682cad5491fc9bf143"} Nov 25 09:49:34 crc kubenswrapper[4565]: I1125 09:49:34.049019 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Nov 25 09:49:34 crc kubenswrapper[4565]: I1125 09:49:34.150383 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Nov 25 09:49:34 crc kubenswrapper[4565]: I1125 09:49:34.537588 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6848d4c5cd-8fv74" podUID="79e96477-2f1b-49e7-89f9-d6a18694af63" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.242:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:43408->10.217.0.242:8443: read: connection reset by peer" Nov 25 09:49:35 crc kubenswrapper[4565]: I1125 09:49:35.026344 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57d9524a-c577-4af2-a063-06c4928a3505","Type":"ContainerStarted","Data":"d7aa18dabf0cf54b2edcadd551a8e4745813ff9ffa229af3fa2b7932b521cc64"} Nov 25 09:49:35 crc kubenswrapper[4565]: I1125 09:49:35.029517 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"60d980f4-b1c6-4991-ae13-143dcb6bf453","Type":"ContainerStarted","Data":"7dd64776047308c1c1276a7c9861797cf97ea0910f37a5730a13baaf6b033d48"} Nov 25 09:49:35 crc kubenswrapper[4565]: I1125 09:49:35.031333 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/manila-share-share1-0" event={"ID":"60d980f4-b1c6-4991-ae13-143dcb6bf453","Type":"ContainerStarted","Data":"350eb54b5b4d89fc341dc10879e18845499e6ea5c20d265cc518090cb4b74e7d"} Nov 25 09:49:35 crc kubenswrapper[4565]: I1125 09:49:35.031464 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"60d980f4-b1c6-4991-ae13-143dcb6bf453","Type":"ContainerStarted","Data":"a7f678f929add9622c017cadca047121bda995e1cf3ccdbf75610390d205232d"} Nov 25 09:49:35 crc kubenswrapper[4565]: I1125 09:49:35.042381 4565 generic.go:334] "Generic (PLEG): container finished" podID="79e96477-2f1b-49e7-89f9-d6a18694af63" containerID="6dab57c95f45fe3dc539cfa46a834007bafce1552d436157f5484e112416ef6a" exitCode=0 Nov 25 09:49:35 crc kubenswrapper[4565]: I1125 09:49:35.042462 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6848d4c5cd-8fv74" event={"ID":"79e96477-2f1b-49e7-89f9-d6a18694af63","Type":"ContainerDied","Data":"6dab57c95f45fe3dc539cfa46a834007bafce1552d436157f5484e112416ef6a"} Nov 25 09:49:35 crc kubenswrapper[4565]: I1125 09:49:35.047855 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=2.047843082 podStartE2EDuration="2.047843082s" podCreationTimestamp="2025-11-25 09:49:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 09:49:35.046504849 +0000 UTC m=+2708.248999987" watchObservedRunningTime="2025-11-25 09:49:35.047843082 +0000 UTC m=+2708.250338220" Nov 25 09:49:36 crc kubenswrapper[4565]: I1125 09:49:36.062427 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57d9524a-c577-4af2-a063-06c4928a3505","Type":"ContainerStarted","Data":"148c84ae58ae61d326b23bf7271df54564ad269b2a6d3309b3b957b3e4f996d5"} Nov 25 09:49:36 crc kubenswrapper[4565]: I1125 09:49:36.064854 4565 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 25 09:49:36 crc kubenswrapper[4565]: I1125 09:49:36.088439 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.74821503 podStartE2EDuration="5.088409509s" podCreationTimestamp="2025-11-25 09:49:31 +0000 UTC" firstStartedPulling="2025-11-25 09:49:31.985260167 +0000 UTC m=+2705.187755305" lastFinishedPulling="2025-11-25 09:49:35.325454647 +0000 UTC m=+2708.527949784" observedRunningTime="2025-11-25 09:49:36.080582664 +0000 UTC m=+2709.283077802" watchObservedRunningTime="2025-11-25 09:49:36.088409509 +0000 UTC m=+2709.290904647" Nov 25 09:49:38 crc kubenswrapper[4565]: I1125 09:49:38.216628 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tdr7h"] Nov 25 09:49:38 crc kubenswrapper[4565]: I1125 09:49:38.220897 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tdr7h" Nov 25 09:49:38 crc kubenswrapper[4565]: I1125 09:49:38.232415 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdr7h"] Nov 25 09:49:38 crc kubenswrapper[4565]: I1125 09:49:38.318697 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrqsh\" (UniqueName: \"kubernetes.io/projected/6f7600d4-ef29-40db-9a58-1965707ef008-kube-api-access-vrqsh\") pod \"redhat-marketplace-tdr7h\" (UID: \"6f7600d4-ef29-40db-9a58-1965707ef008\") " pod="openshift-marketplace/redhat-marketplace-tdr7h" Nov 25 09:49:38 crc kubenswrapper[4565]: I1125 09:49:38.318754 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f7600d4-ef29-40db-9a58-1965707ef008-utilities\") pod \"redhat-marketplace-tdr7h\" (UID: 
\"6f7600d4-ef29-40db-9a58-1965707ef008\") " pod="openshift-marketplace/redhat-marketplace-tdr7h" Nov 25 09:49:38 crc kubenswrapper[4565]: I1125 09:49:38.318846 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f7600d4-ef29-40db-9a58-1965707ef008-catalog-content\") pod \"redhat-marketplace-tdr7h\" (UID: \"6f7600d4-ef29-40db-9a58-1965707ef008\") " pod="openshift-marketplace/redhat-marketplace-tdr7h" Nov 25 09:49:38 crc kubenswrapper[4565]: I1125 09:49:38.420297 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrqsh\" (UniqueName: \"kubernetes.io/projected/6f7600d4-ef29-40db-9a58-1965707ef008-kube-api-access-vrqsh\") pod \"redhat-marketplace-tdr7h\" (UID: \"6f7600d4-ef29-40db-9a58-1965707ef008\") " pod="openshift-marketplace/redhat-marketplace-tdr7h" Nov 25 09:49:38 crc kubenswrapper[4565]: I1125 09:49:38.420360 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f7600d4-ef29-40db-9a58-1965707ef008-utilities\") pod \"redhat-marketplace-tdr7h\" (UID: \"6f7600d4-ef29-40db-9a58-1965707ef008\") " pod="openshift-marketplace/redhat-marketplace-tdr7h" Nov 25 09:49:38 crc kubenswrapper[4565]: I1125 09:49:38.420392 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f7600d4-ef29-40db-9a58-1965707ef008-catalog-content\") pod \"redhat-marketplace-tdr7h\" (UID: \"6f7600d4-ef29-40db-9a58-1965707ef008\") " pod="openshift-marketplace/redhat-marketplace-tdr7h" Nov 25 09:49:38 crc kubenswrapper[4565]: I1125 09:49:38.420987 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f7600d4-ef29-40db-9a58-1965707ef008-catalog-content\") pod \"redhat-marketplace-tdr7h\" (UID: 
\"6f7600d4-ef29-40db-9a58-1965707ef008\") " pod="openshift-marketplace/redhat-marketplace-tdr7h" Nov 25 09:49:38 crc kubenswrapper[4565]: I1125 09:49:38.421496 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f7600d4-ef29-40db-9a58-1965707ef008-utilities\") pod \"redhat-marketplace-tdr7h\" (UID: \"6f7600d4-ef29-40db-9a58-1965707ef008\") " pod="openshift-marketplace/redhat-marketplace-tdr7h" Nov 25 09:49:38 crc kubenswrapper[4565]: I1125 09:49:38.442233 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrqsh\" (UniqueName: \"kubernetes.io/projected/6f7600d4-ef29-40db-9a58-1965707ef008-kube-api-access-vrqsh\") pod \"redhat-marketplace-tdr7h\" (UID: \"6f7600d4-ef29-40db-9a58-1965707ef008\") " pod="openshift-marketplace/redhat-marketplace-tdr7h" Nov 25 09:49:38 crc kubenswrapper[4565]: I1125 09:49:38.540749 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tdr7h" Nov 25 09:49:39 crc kubenswrapper[4565]: I1125 09:49:39.041861 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdr7h"] Nov 25 09:49:39 crc kubenswrapper[4565]: W1125 09:49:39.047357 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f7600d4_ef29_40db_9a58_1965707ef008.slice/crio-8772ed5656fcfbda42c88e59cb348b6fe64bf1f8edb72fe576a2802eb677ab53 WatchSource:0}: Error finding container 8772ed5656fcfbda42c88e59cb348b6fe64bf1f8edb72fe576a2802eb677ab53: Status 404 returned error can't find the container with id 8772ed5656fcfbda42c88e59cb348b6fe64bf1f8edb72fe576a2802eb677ab53 Nov 25 09:49:39 crc kubenswrapper[4565]: I1125 09:49:39.112822 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdr7h" 
event={"ID":"6f7600d4-ef29-40db-9a58-1965707ef008","Type":"ContainerStarted","Data":"8772ed5656fcfbda42c88e59cb348b6fe64bf1f8edb72fe576a2802eb677ab53"} Nov 25 09:49:39 crc kubenswrapper[4565]: E1125 09:49:39.408176 4565 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f7600d4_ef29_40db_9a58_1965707ef008.slice/crio-a85f9a89a00cd0539b6fe90adb0fc8128803feb8bd299deff9eeffad5d127998.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f7600d4_ef29_40db_9a58_1965707ef008.slice/crio-conmon-a85f9a89a00cd0539b6fe90adb0fc8128803feb8bd299deff9eeffad5d127998.scope\": RecentStats: unable to find data in memory cache]" Nov 25 09:49:40 crc kubenswrapper[4565]: I1125 09:49:40.112193 4565 generic.go:334] "Generic (PLEG): container finished" podID="6f7600d4-ef29-40db-9a58-1965707ef008" containerID="a85f9a89a00cd0539b6fe90adb0fc8128803feb8bd299deff9eeffad5d127998" exitCode=0 Nov 25 09:49:40 crc kubenswrapper[4565]: I1125 09:49:40.112273 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdr7h" event={"ID":"6f7600d4-ef29-40db-9a58-1965707ef008","Type":"ContainerDied","Data":"a85f9a89a00cd0539b6fe90adb0fc8128803feb8bd299deff9eeffad5d127998"} Nov 25 09:49:41 crc kubenswrapper[4565]: I1125 09:49:41.139276 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdr7h" event={"ID":"6f7600d4-ef29-40db-9a58-1965707ef008","Type":"ContainerStarted","Data":"7dba09007d0ef81f83efb29cf60f0586f496ea187d6a7891daca05abb1eb82fd"} Nov 25 09:49:41 crc kubenswrapper[4565]: I1125 09:49:41.583838 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wgp4j" Nov 25 09:49:41 crc kubenswrapper[4565]: I1125 09:49:41.628203 4565 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wgp4j" Nov 25 09:49:42 crc kubenswrapper[4565]: I1125 09:49:42.149138 4565 generic.go:334] "Generic (PLEG): container finished" podID="6f7600d4-ef29-40db-9a58-1965707ef008" containerID="7dba09007d0ef81f83efb29cf60f0586f496ea187d6a7891daca05abb1eb82fd" exitCode=0 Nov 25 09:49:42 crc kubenswrapper[4565]: I1125 09:49:42.149224 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdr7h" event={"ID":"6f7600d4-ef29-40db-9a58-1965707ef008","Type":"ContainerDied","Data":"7dba09007d0ef81f83efb29cf60f0586f496ea187d6a7891daca05abb1eb82fd"} Nov 25 09:49:43 crc kubenswrapper[4565]: I1125 09:49:43.164841 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdr7h" event={"ID":"6f7600d4-ef29-40db-9a58-1965707ef008","Type":"ContainerStarted","Data":"331e4fb4960b3189d54e145c1feefc75e8535d85fd6ec7f25fd081df6d32d1e9"} Nov 25 09:49:43 crc kubenswrapper[4565]: I1125 09:49:43.185355 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tdr7h" podStartSLOduration=2.696451595 podStartE2EDuration="5.185326222s" podCreationTimestamp="2025-11-25 09:49:38 +0000 UTC" firstStartedPulling="2025-11-25 09:49:40.115019376 +0000 UTC m=+2713.317514514" lastFinishedPulling="2025-11-25 09:49:42.603894002 +0000 UTC m=+2715.806389141" observedRunningTime="2025-11-25 09:49:43.182045027 +0000 UTC m=+2716.384540165" watchObservedRunningTime="2025-11-25 09:49:43.185326222 +0000 UTC m=+2716.387821360" Nov 25 09:49:43 crc kubenswrapper[4565]: I1125 09:49:43.370130 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Nov 25 09:49:44 crc kubenswrapper[4565]: I1125 09:49:44.362263 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6848d4c5cd-8fv74" 
podUID="79e96477-2f1b-49e7-89f9-d6a18694af63" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.242:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.242:8443: connect: connection refused" Nov 25 09:49:45 crc kubenswrapper[4565]: I1125 09:49:45.001181 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wgp4j"] Nov 25 09:49:45 crc kubenswrapper[4565]: I1125 09:49:45.001636 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wgp4j" podUID="086c74f4-2c43-43a5-953d-0510a030706a" containerName="registry-server" containerID="cri-o://ff3a1c778b2761e13b33b434c28d2db05e4324f2b8fcc852b3660f8241f88c69" gracePeriod=2 Nov 25 09:49:45 crc kubenswrapper[4565]: I1125 09:49:45.185784 4565 generic.go:334] "Generic (PLEG): container finished" podID="086c74f4-2c43-43a5-953d-0510a030706a" containerID="ff3a1c778b2761e13b33b434c28d2db05e4324f2b8fcc852b3660f8241f88c69" exitCode=0 Nov 25 09:49:45 crc kubenswrapper[4565]: I1125 09:49:45.185846 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgp4j" event={"ID":"086c74f4-2c43-43a5-953d-0510a030706a","Type":"ContainerDied","Data":"ff3a1c778b2761e13b33b434c28d2db05e4324f2b8fcc852b3660f8241f88c69"} Nov 25 09:49:45 crc kubenswrapper[4565]: I1125 09:49:45.469269 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wgp4j" Nov 25 09:49:45 crc kubenswrapper[4565]: I1125 09:49:45.508433 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jx26\" (UniqueName: \"kubernetes.io/projected/086c74f4-2c43-43a5-953d-0510a030706a-kube-api-access-5jx26\") pod \"086c74f4-2c43-43a5-953d-0510a030706a\" (UID: \"086c74f4-2c43-43a5-953d-0510a030706a\") " Nov 25 09:49:45 crc kubenswrapper[4565]: I1125 09:49:45.508672 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/086c74f4-2c43-43a5-953d-0510a030706a-catalog-content\") pod \"086c74f4-2c43-43a5-953d-0510a030706a\" (UID: \"086c74f4-2c43-43a5-953d-0510a030706a\") " Nov 25 09:49:45 crc kubenswrapper[4565]: I1125 09:49:45.508746 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/086c74f4-2c43-43a5-953d-0510a030706a-utilities\") pod \"086c74f4-2c43-43a5-953d-0510a030706a\" (UID: \"086c74f4-2c43-43a5-953d-0510a030706a\") " Nov 25 09:49:45 crc kubenswrapper[4565]: I1125 09:49:45.509405 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/086c74f4-2c43-43a5-953d-0510a030706a-utilities" (OuterVolumeSpecName: "utilities") pod "086c74f4-2c43-43a5-953d-0510a030706a" (UID: "086c74f4-2c43-43a5-953d-0510a030706a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:49:45 crc kubenswrapper[4565]: I1125 09:49:45.516169 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/086c74f4-2c43-43a5-953d-0510a030706a-kube-api-access-5jx26" (OuterVolumeSpecName: "kube-api-access-5jx26") pod "086c74f4-2c43-43a5-953d-0510a030706a" (UID: "086c74f4-2c43-43a5-953d-0510a030706a"). InnerVolumeSpecName "kube-api-access-5jx26". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:49:45 crc kubenswrapper[4565]: I1125 09:49:45.523586 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Nov 25 09:49:45 crc kubenswrapper[4565]: I1125 09:49:45.571810 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/086c74f4-2c43-43a5-953d-0510a030706a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "086c74f4-2c43-43a5-953d-0510a030706a" (UID: "086c74f4-2c43-43a5-953d-0510a030706a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:49:45 crc kubenswrapper[4565]: I1125 09:49:45.612421 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/086c74f4-2c43-43a5-953d-0510a030706a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:45 crc kubenswrapper[4565]: I1125 09:49:45.612456 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/086c74f4-2c43-43a5-953d-0510a030706a-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:45 crc kubenswrapper[4565]: I1125 09:49:45.612468 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jx26\" (UniqueName: \"kubernetes.io/projected/086c74f4-2c43-43a5-953d-0510a030706a-kube-api-access-5jx26\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:46 crc kubenswrapper[4565]: I1125 09:49:46.196115 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgp4j" event={"ID":"086c74f4-2c43-43a5-953d-0510a030706a","Type":"ContainerDied","Data":"301c7b2491856978feb2ba288444508284e4020aed831d71b35495a720e22ad7"} Nov 25 09:49:46 crc kubenswrapper[4565]: I1125 09:49:46.196165 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wgp4j" Nov 25 09:49:46 crc kubenswrapper[4565]: I1125 09:49:46.196176 4565 scope.go:117] "RemoveContainer" containerID="ff3a1c778b2761e13b33b434c28d2db05e4324f2b8fcc852b3660f8241f88c69" Nov 25 09:49:46 crc kubenswrapper[4565]: I1125 09:49:46.227439 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wgp4j"] Nov 25 09:49:46 crc kubenswrapper[4565]: I1125 09:49:46.228377 4565 scope.go:117] "RemoveContainer" containerID="c87877e6c0907db46d4c90187fceb9facddccdea0b1f4443b811e646f09d59ae" Nov 25 09:49:46 crc kubenswrapper[4565]: I1125 09:49:46.235973 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wgp4j"] Nov 25 09:49:46 crc kubenswrapper[4565]: I1125 09:49:46.253911 4565 scope.go:117] "RemoveContainer" containerID="34651906671a225f5cd738dca6298664bf395d6b408885ec69f16e38c608625a" Nov 25 09:49:47 crc kubenswrapper[4565]: I1125 09:49:47.106748 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="086c74f4-2c43-43a5-953d-0510a030706a" path="/var/lib/kubelet/pods/086c74f4-2c43-43a5-953d-0510a030706a/volumes" Nov 25 09:49:48 crc kubenswrapper[4565]: I1125 09:49:48.541617 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tdr7h" Nov 25 09:49:48 crc kubenswrapper[4565]: I1125 09:49:48.542015 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tdr7h" Nov 25 09:49:48 crc kubenswrapper[4565]: I1125 09:49:48.581876 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tdr7h" Nov 25 09:49:49 crc kubenswrapper[4565]: I1125 09:49:49.269164 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tdr7h" Nov 25 09:49:50 crc 
kubenswrapper[4565]: I1125 09:49:50.201034 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdr7h"] Nov 25 09:49:51 crc kubenswrapper[4565]: I1125 09:49:51.247768 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tdr7h" podUID="6f7600d4-ef29-40db-9a58-1965707ef008" containerName="registry-server" containerID="cri-o://331e4fb4960b3189d54e145c1feefc75e8535d85fd6ec7f25fd081df6d32d1e9" gracePeriod=2 Nov 25 09:49:51 crc kubenswrapper[4565]: I1125 09:49:51.710076 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tdr7h" Nov 25 09:49:51 crc kubenswrapper[4565]: I1125 09:49:51.747826 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrqsh\" (UniqueName: \"kubernetes.io/projected/6f7600d4-ef29-40db-9a58-1965707ef008-kube-api-access-vrqsh\") pod \"6f7600d4-ef29-40db-9a58-1965707ef008\" (UID: \"6f7600d4-ef29-40db-9a58-1965707ef008\") " Nov 25 09:49:51 crc kubenswrapper[4565]: I1125 09:49:51.747890 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f7600d4-ef29-40db-9a58-1965707ef008-catalog-content\") pod \"6f7600d4-ef29-40db-9a58-1965707ef008\" (UID: \"6f7600d4-ef29-40db-9a58-1965707ef008\") " Nov 25 09:49:51 crc kubenswrapper[4565]: I1125 09:49:51.747940 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f7600d4-ef29-40db-9a58-1965707ef008-utilities\") pod \"6f7600d4-ef29-40db-9a58-1965707ef008\" (UID: \"6f7600d4-ef29-40db-9a58-1965707ef008\") " Nov 25 09:49:51 crc kubenswrapper[4565]: I1125 09:49:51.749105 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6f7600d4-ef29-40db-9a58-1965707ef008-utilities" (OuterVolumeSpecName: "utilities") pod "6f7600d4-ef29-40db-9a58-1965707ef008" (UID: "6f7600d4-ef29-40db-9a58-1965707ef008"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:49:51 crc kubenswrapper[4565]: I1125 09:49:51.761065 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f7600d4-ef29-40db-9a58-1965707ef008-kube-api-access-vrqsh" (OuterVolumeSpecName: "kube-api-access-vrqsh") pod "6f7600d4-ef29-40db-9a58-1965707ef008" (UID: "6f7600d4-ef29-40db-9a58-1965707ef008"). InnerVolumeSpecName "kube-api-access-vrqsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:49:51 crc kubenswrapper[4565]: I1125 09:49:51.762461 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f7600d4-ef29-40db-9a58-1965707ef008-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f7600d4-ef29-40db-9a58-1965707ef008" (UID: "6f7600d4-ef29-40db-9a58-1965707ef008"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:49:51 crc kubenswrapper[4565]: I1125 09:49:51.850082 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrqsh\" (UniqueName: \"kubernetes.io/projected/6f7600d4-ef29-40db-9a58-1965707ef008-kube-api-access-vrqsh\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:51 crc kubenswrapper[4565]: I1125 09:49:51.850123 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f7600d4-ef29-40db-9a58-1965707ef008-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:51 crc kubenswrapper[4565]: I1125 09:49:51.850137 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f7600d4-ef29-40db-9a58-1965707ef008-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:49:52 crc kubenswrapper[4565]: I1125 09:49:52.257452 4565 generic.go:334] "Generic (PLEG): container finished" podID="6f7600d4-ef29-40db-9a58-1965707ef008" containerID="331e4fb4960b3189d54e145c1feefc75e8535d85fd6ec7f25fd081df6d32d1e9" exitCode=0 Nov 25 09:49:52 crc kubenswrapper[4565]: I1125 09:49:52.257510 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdr7h" event={"ID":"6f7600d4-ef29-40db-9a58-1965707ef008","Type":"ContainerDied","Data":"331e4fb4960b3189d54e145c1feefc75e8535d85fd6ec7f25fd081df6d32d1e9"} Nov 25 09:49:52 crc kubenswrapper[4565]: I1125 09:49:52.257600 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdr7h" event={"ID":"6f7600d4-ef29-40db-9a58-1965707ef008","Type":"ContainerDied","Data":"8772ed5656fcfbda42c88e59cb348b6fe64bf1f8edb72fe576a2802eb677ab53"} Nov 25 09:49:52 crc kubenswrapper[4565]: I1125 09:49:52.257619 4565 scope.go:117] "RemoveContainer" containerID="331e4fb4960b3189d54e145c1feefc75e8535d85fd6ec7f25fd081df6d32d1e9" Nov 25 09:49:52 crc kubenswrapper[4565]: I1125 
09:49:52.257799 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tdr7h" Nov 25 09:49:52 crc kubenswrapper[4565]: I1125 09:49:52.287368 4565 scope.go:117] "RemoveContainer" containerID="7dba09007d0ef81f83efb29cf60f0586f496ea187d6a7891daca05abb1eb82fd" Nov 25 09:49:52 crc kubenswrapper[4565]: I1125 09:49:52.298216 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdr7h"] Nov 25 09:49:52 crc kubenswrapper[4565]: I1125 09:49:52.305422 4565 scope.go:117] "RemoveContainer" containerID="a85f9a89a00cd0539b6fe90adb0fc8128803feb8bd299deff9eeffad5d127998" Nov 25 09:49:52 crc kubenswrapper[4565]: I1125 09:49:52.307988 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdr7h"] Nov 25 09:49:52 crc kubenswrapper[4565]: I1125 09:49:52.349853 4565 scope.go:117] "RemoveContainer" containerID="331e4fb4960b3189d54e145c1feefc75e8535d85fd6ec7f25fd081df6d32d1e9" Nov 25 09:49:52 crc kubenswrapper[4565]: E1125 09:49:52.350385 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"331e4fb4960b3189d54e145c1feefc75e8535d85fd6ec7f25fd081df6d32d1e9\": container with ID starting with 331e4fb4960b3189d54e145c1feefc75e8535d85fd6ec7f25fd081df6d32d1e9 not found: ID does not exist" containerID="331e4fb4960b3189d54e145c1feefc75e8535d85fd6ec7f25fd081df6d32d1e9" Nov 25 09:49:52 crc kubenswrapper[4565]: I1125 09:49:52.350436 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"331e4fb4960b3189d54e145c1feefc75e8535d85fd6ec7f25fd081df6d32d1e9"} err="failed to get container status \"331e4fb4960b3189d54e145c1feefc75e8535d85fd6ec7f25fd081df6d32d1e9\": rpc error: code = NotFound desc = could not find container \"331e4fb4960b3189d54e145c1feefc75e8535d85fd6ec7f25fd081df6d32d1e9\": container with ID starting with 
331e4fb4960b3189d54e145c1feefc75e8535d85fd6ec7f25fd081df6d32d1e9 not found: ID does not exist" Nov 25 09:49:52 crc kubenswrapper[4565]: I1125 09:49:52.350465 4565 scope.go:117] "RemoveContainer" containerID="7dba09007d0ef81f83efb29cf60f0586f496ea187d6a7891daca05abb1eb82fd" Nov 25 09:49:52 crc kubenswrapper[4565]: E1125 09:49:52.350954 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dba09007d0ef81f83efb29cf60f0586f496ea187d6a7891daca05abb1eb82fd\": container with ID starting with 7dba09007d0ef81f83efb29cf60f0586f496ea187d6a7891daca05abb1eb82fd not found: ID does not exist" containerID="7dba09007d0ef81f83efb29cf60f0586f496ea187d6a7891daca05abb1eb82fd" Nov 25 09:49:52 crc kubenswrapper[4565]: I1125 09:49:52.351056 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dba09007d0ef81f83efb29cf60f0586f496ea187d6a7891daca05abb1eb82fd"} err="failed to get container status \"7dba09007d0ef81f83efb29cf60f0586f496ea187d6a7891daca05abb1eb82fd\": rpc error: code = NotFound desc = could not find container \"7dba09007d0ef81f83efb29cf60f0586f496ea187d6a7891daca05abb1eb82fd\": container with ID starting with 7dba09007d0ef81f83efb29cf60f0586f496ea187d6a7891daca05abb1eb82fd not found: ID does not exist" Nov 25 09:49:52 crc kubenswrapper[4565]: I1125 09:49:52.351128 4565 scope.go:117] "RemoveContainer" containerID="a85f9a89a00cd0539b6fe90adb0fc8128803feb8bd299deff9eeffad5d127998" Nov 25 09:49:52 crc kubenswrapper[4565]: E1125 09:49:52.351458 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a85f9a89a00cd0539b6fe90adb0fc8128803feb8bd299deff9eeffad5d127998\": container with ID starting with a85f9a89a00cd0539b6fe90adb0fc8128803feb8bd299deff9eeffad5d127998 not found: ID does not exist" containerID="a85f9a89a00cd0539b6fe90adb0fc8128803feb8bd299deff9eeffad5d127998" Nov 25 09:49:52 crc 
kubenswrapper[4565]: I1125 09:49:52.351533 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a85f9a89a00cd0539b6fe90adb0fc8128803feb8bd299deff9eeffad5d127998"} err="failed to get container status \"a85f9a89a00cd0539b6fe90adb0fc8128803feb8bd299deff9eeffad5d127998\": rpc error: code = NotFound desc = could not find container \"a85f9a89a00cd0539b6fe90adb0fc8128803feb8bd299deff9eeffad5d127998\": container with ID starting with a85f9a89a00cd0539b6fe90adb0fc8128803feb8bd299deff9eeffad5d127998 not found: ID does not exist" Nov 25 09:49:53 crc kubenswrapper[4565]: I1125 09:49:53.106801 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f7600d4-ef29-40db-9a58-1965707ef008" path="/var/lib/kubelet/pods/6f7600d4-ef29-40db-9a58-1965707ef008/volumes" Nov 25 09:49:54 crc kubenswrapper[4565]: I1125 09:49:54.361844 4565 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6848d4c5cd-8fv74" podUID="79e96477-2f1b-49e7-89f9-d6a18694af63" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.242:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.242:8443: connect: connection refused" Nov 25 09:49:54 crc kubenswrapper[4565]: I1125 09:49:54.362236 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:49:54 crc kubenswrapper[4565]: I1125 09:49:54.877649 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.286136 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.336282 4565 generic.go:334] "Generic (PLEG): container finished" podID="79e96477-2f1b-49e7-89f9-d6a18694af63" containerID="9eb2815b0f68a95e51f96b7a202d3b8d46040bf18429f85713e40a2240b1f806" exitCode=137 Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.336325 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6848d4c5cd-8fv74" event={"ID":"79e96477-2f1b-49e7-89f9-d6a18694af63","Type":"ContainerDied","Data":"9eb2815b0f68a95e51f96b7a202d3b8d46040bf18429f85713e40a2240b1f806"} Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.336350 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6848d4c5cd-8fv74" event={"ID":"79e96477-2f1b-49e7-89f9-d6a18694af63","Type":"ContainerDied","Data":"37c3dc2a69d6edbc8600eb25c60b3f8b803a590a67a38f3dcf74c28b21357123"} Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.336367 4565 scope.go:117] "RemoveContainer" containerID="6dab57c95f45fe3dc539cfa46a834007bafce1552d436157f5484e112416ef6a" Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.336331 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6848d4c5cd-8fv74" Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.451518 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/79e96477-2f1b-49e7-89f9-d6a18694af63-horizon-secret-key\") pod \"79e96477-2f1b-49e7-89f9-d6a18694af63\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.451649 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/79e96477-2f1b-49e7-89f9-d6a18694af63-horizon-tls-certs\") pod \"79e96477-2f1b-49e7-89f9-d6a18694af63\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.451735 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcgqb\" (UniqueName: \"kubernetes.io/projected/79e96477-2f1b-49e7-89f9-d6a18694af63-kube-api-access-wcgqb\") pod \"79e96477-2f1b-49e7-89f9-d6a18694af63\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.451768 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e96477-2f1b-49e7-89f9-d6a18694af63-combined-ca-bundle\") pod \"79e96477-2f1b-49e7-89f9-d6a18694af63\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.451816 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79e96477-2f1b-49e7-89f9-d6a18694af63-config-data\") pod \"79e96477-2f1b-49e7-89f9-d6a18694af63\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.451922 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79e96477-2f1b-49e7-89f9-d6a18694af63-logs\") pod \"79e96477-2f1b-49e7-89f9-d6a18694af63\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.451978 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79e96477-2f1b-49e7-89f9-d6a18694af63-scripts\") pod \"79e96477-2f1b-49e7-89f9-d6a18694af63\" (UID: \"79e96477-2f1b-49e7-89f9-d6a18694af63\") " Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.452964 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79e96477-2f1b-49e7-89f9-d6a18694af63-logs" (OuterVolumeSpecName: "logs") pod "79e96477-2f1b-49e7-89f9-d6a18694af63" (UID: "79e96477-2f1b-49e7-89f9-d6a18694af63"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.460083 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79e96477-2f1b-49e7-89f9-d6a18694af63-kube-api-access-wcgqb" (OuterVolumeSpecName: "kube-api-access-wcgqb") pod "79e96477-2f1b-49e7-89f9-d6a18694af63" (UID: "79e96477-2f1b-49e7-89f9-d6a18694af63"). InnerVolumeSpecName "kube-api-access-wcgqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.471581 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79e96477-2f1b-49e7-89f9-d6a18694af63-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "79e96477-2f1b-49e7-89f9-d6a18694af63" (UID: "79e96477-2f1b-49e7-89f9-d6a18694af63"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.475433 4565 scope.go:117] "RemoveContainer" containerID="9eb2815b0f68a95e51f96b7a202d3b8d46040bf18429f85713e40a2240b1f806" Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.477025 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79e96477-2f1b-49e7-89f9-d6a18694af63-config-data" (OuterVolumeSpecName: "config-data") pod "79e96477-2f1b-49e7-89f9-d6a18694af63" (UID: "79e96477-2f1b-49e7-89f9-d6a18694af63"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.478606 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79e96477-2f1b-49e7-89f9-d6a18694af63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79e96477-2f1b-49e7-89f9-d6a18694af63" (UID: "79e96477-2f1b-49e7-89f9-d6a18694af63"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.480449 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79e96477-2f1b-49e7-89f9-d6a18694af63-scripts" (OuterVolumeSpecName: "scripts") pod "79e96477-2f1b-49e7-89f9-d6a18694af63" (UID: "79e96477-2f1b-49e7-89f9-d6a18694af63"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.498403 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79e96477-2f1b-49e7-89f9-d6a18694af63-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "79e96477-2f1b-49e7-89f9-d6a18694af63" (UID: "79e96477-2f1b-49e7-89f9-d6a18694af63"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.501015 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.557197 4565 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/79e96477-2f1b-49e7-89f9-d6a18694af63-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.557232 4565 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/79e96477-2f1b-49e7-89f9-d6a18694af63-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.557242 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcgqb\" (UniqueName: \"kubernetes.io/projected/79e96477-2f1b-49e7-89f9-d6a18694af63-kube-api-access-wcgqb\") on node \"crc\" DevicePath \"\"" Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.557252 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e96477-2f1b-49e7-89f9-d6a18694af63-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.557262 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79e96477-2f1b-49e7-89f9-d6a18694af63-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.557271 4565 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79e96477-2f1b-49e7-89f9-d6a18694af63-logs\") on node \"crc\" DevicePath \"\"" Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.557279 4565 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/79e96477-2f1b-49e7-89f9-d6a18694af63-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.571339 4565 scope.go:117] "RemoveContainer" containerID="6dab57c95f45fe3dc539cfa46a834007bafce1552d436157f5484e112416ef6a" Nov 25 09:50:01 crc kubenswrapper[4565]: E1125 09:50:01.571983 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dab57c95f45fe3dc539cfa46a834007bafce1552d436157f5484e112416ef6a\": container with ID starting with 6dab57c95f45fe3dc539cfa46a834007bafce1552d436157f5484e112416ef6a not found: ID does not exist" containerID="6dab57c95f45fe3dc539cfa46a834007bafce1552d436157f5484e112416ef6a" Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.572049 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dab57c95f45fe3dc539cfa46a834007bafce1552d436157f5484e112416ef6a"} err="failed to get container status \"6dab57c95f45fe3dc539cfa46a834007bafce1552d436157f5484e112416ef6a\": rpc error: code = NotFound desc = could not find container \"6dab57c95f45fe3dc539cfa46a834007bafce1552d436157f5484e112416ef6a\": container with ID starting with 6dab57c95f45fe3dc539cfa46a834007bafce1552d436157f5484e112416ef6a not found: ID does not exist" Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.572084 4565 scope.go:117] "RemoveContainer" containerID="9eb2815b0f68a95e51f96b7a202d3b8d46040bf18429f85713e40a2240b1f806" Nov 25 09:50:01 crc kubenswrapper[4565]: E1125 09:50:01.573365 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9eb2815b0f68a95e51f96b7a202d3b8d46040bf18429f85713e40a2240b1f806\": container with ID starting with 9eb2815b0f68a95e51f96b7a202d3b8d46040bf18429f85713e40a2240b1f806 not found: ID does not exist" containerID="9eb2815b0f68a95e51f96b7a202d3b8d46040bf18429f85713e40a2240b1f806" Nov 25 
09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.573420 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eb2815b0f68a95e51f96b7a202d3b8d46040bf18429f85713e40a2240b1f806"} err="failed to get container status \"9eb2815b0f68a95e51f96b7a202d3b8d46040bf18429f85713e40a2240b1f806\": rpc error: code = NotFound desc = could not find container \"9eb2815b0f68a95e51f96b7a202d3b8d46040bf18429f85713e40a2240b1f806\": container with ID starting with 9eb2815b0f68a95e51f96b7a202d3b8d46040bf18429f85713e40a2240b1f806 not found: ID does not exist" Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.673625 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6848d4c5cd-8fv74"] Nov 25 09:50:01 crc kubenswrapper[4565]: I1125 09:50:01.680038 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6848d4c5cd-8fv74"] Nov 25 09:50:03 crc kubenswrapper[4565]: I1125 09:50:03.108120 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79e96477-2f1b-49e7-89f9-d6a18694af63" path="/var/lib/kubelet/pods/79e96477-2f1b-49e7-89f9-d6a18694af63/volumes" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.837277 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Nov 25 09:50:45 crc kubenswrapper[4565]: E1125 09:50:45.838564 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e96477-2f1b-49e7-89f9-d6a18694af63" containerName="horizon" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.838584 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e96477-2f1b-49e7-89f9-d6a18694af63" containerName="horizon" Nov 25 09:50:45 crc kubenswrapper[4565]: E1125 09:50:45.838600 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="086c74f4-2c43-43a5-953d-0510a030706a" containerName="extract-content" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.838606 4565 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="086c74f4-2c43-43a5-953d-0510a030706a" containerName="extract-content" Nov 25 09:50:45 crc kubenswrapper[4565]: E1125 09:50:45.838626 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f7600d4-ef29-40db-9a58-1965707ef008" containerName="extract-content" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.838632 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f7600d4-ef29-40db-9a58-1965707ef008" containerName="extract-content" Nov 25 09:50:45 crc kubenswrapper[4565]: E1125 09:50:45.838643 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e96477-2f1b-49e7-89f9-d6a18694af63" containerName="horizon-log" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.838649 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e96477-2f1b-49e7-89f9-d6a18694af63" containerName="horizon-log" Nov 25 09:50:45 crc kubenswrapper[4565]: E1125 09:50:45.838658 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f7600d4-ef29-40db-9a58-1965707ef008" containerName="registry-server" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.838664 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f7600d4-ef29-40db-9a58-1965707ef008" containerName="registry-server" Nov 25 09:50:45 crc kubenswrapper[4565]: E1125 09:50:45.838678 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="086c74f4-2c43-43a5-953d-0510a030706a" containerName="registry-server" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.838684 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="086c74f4-2c43-43a5-953d-0510a030706a" containerName="registry-server" Nov 25 09:50:45 crc kubenswrapper[4565]: E1125 09:50:45.838697 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="086c74f4-2c43-43a5-953d-0510a030706a" containerName="extract-utilities" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.838702 4565 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="086c74f4-2c43-43a5-953d-0510a030706a" containerName="extract-utilities" Nov 25 09:50:45 crc kubenswrapper[4565]: E1125 09:50:45.838725 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f7600d4-ef29-40db-9a58-1965707ef008" containerName="extract-utilities" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.838732 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f7600d4-ef29-40db-9a58-1965707ef008" containerName="extract-utilities" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.839287 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f7600d4-ef29-40db-9a58-1965707ef008" containerName="registry-server" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.839305 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="79e96477-2f1b-49e7-89f9-d6a18694af63" containerName="horizon-log" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.839350 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="79e96477-2f1b-49e7-89f9-d6a18694af63" containerName="horizon" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.839378 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="086c74f4-2c43-43a5-953d-0510a030706a" containerName="registry-server" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.841343 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.843623 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-config-data\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.843759 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.843876 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.844157 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.844485 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.844494 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.845214 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-2kmjl" Nov 25 09:50:45 crc 
kubenswrapper[4565]: I1125 09:50:45.854944 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.946128 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-config-data\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.946220 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.946257 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6gbs\" (UniqueName: \"kubernetes.io/projected/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-kube-api-access-j6gbs\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.946308 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.946440 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-test-operator-ephemeral-workdir\") pod 
\"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.946540 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.946674 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.946709 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.946743 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.947326 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-config-data\") pod \"tempest-tests-tempest\" (UID: 
\"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.947902 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:45 crc kubenswrapper[4565]: I1125 09:50:45.952641 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:46 crc kubenswrapper[4565]: I1125 09:50:46.049771 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:46 crc kubenswrapper[4565]: I1125 09:50:46.049849 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6gbs\" (UniqueName: \"kubernetes.io/projected/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-kube-api-access-j6gbs\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:46 crc kubenswrapper[4565]: I1125 09:50:46.049909 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:46 crc kubenswrapper[4565]: I1125 09:50:46.049964 
4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:46 crc kubenswrapper[4565]: I1125 09:50:46.050043 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:46 crc kubenswrapper[4565]: I1125 09:50:46.050082 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:46 crc kubenswrapper[4565]: I1125 09:50:46.050678 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:46 crc kubenswrapper[4565]: I1125 09:50:46.050717 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:46 crc kubenswrapper[4565]: I1125 09:50:46.050840 4565 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/tempest-tests-tempest" Nov 25 09:50:46 crc kubenswrapper[4565]: I1125 09:50:46.054896 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:46 crc kubenswrapper[4565]: I1125 09:50:46.061546 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:46 crc kubenswrapper[4565]: I1125 09:50:46.066467 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6gbs\" (UniqueName: \"kubernetes.io/projected/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-kube-api-access-j6gbs\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:46 crc kubenswrapper[4565]: I1125 09:50:46.075460 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " pod="openstack/tempest-tests-tempest" Nov 25 09:50:46 crc kubenswrapper[4565]: I1125 09:50:46.172460 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 25 09:50:46 crc kubenswrapper[4565]: I1125 09:50:46.606245 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Nov 25 09:50:46 crc kubenswrapper[4565]: I1125 09:50:46.828141 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2","Type":"ContainerStarted","Data":"3c39a243e4a2d0f71c835739bdb2af34b848e9e77320c5b9790702e7708926ac"} Nov 25 09:50:52 crc kubenswrapper[4565]: I1125 09:50:52.393372 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jqkkv"] Nov 25 09:50:52 crc kubenswrapper[4565]: I1125 09:50:52.401171 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jqkkv" Nov 25 09:50:52 crc kubenswrapper[4565]: I1125 09:50:52.408966 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jqkkv"] Nov 25 09:50:52 crc kubenswrapper[4565]: I1125 09:50:52.457986 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80049c09-f854-4777-aeae-08234d72e474-catalog-content\") pod \"redhat-operators-jqkkv\" (UID: \"80049c09-f854-4777-aeae-08234d72e474\") " pod="openshift-marketplace/redhat-operators-jqkkv" Nov 25 09:50:52 crc kubenswrapper[4565]: I1125 09:50:52.458107 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzhp2\" (UniqueName: \"kubernetes.io/projected/80049c09-f854-4777-aeae-08234d72e474-kube-api-access-hzhp2\") pod \"redhat-operators-jqkkv\" (UID: \"80049c09-f854-4777-aeae-08234d72e474\") " pod="openshift-marketplace/redhat-operators-jqkkv" Nov 25 09:50:52 crc kubenswrapper[4565]: I1125 09:50:52.458249 4565 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80049c09-f854-4777-aeae-08234d72e474-utilities\") pod \"redhat-operators-jqkkv\" (UID: \"80049c09-f854-4777-aeae-08234d72e474\") " pod="openshift-marketplace/redhat-operators-jqkkv" Nov 25 09:50:52 crc kubenswrapper[4565]: I1125 09:50:52.562383 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80049c09-f854-4777-aeae-08234d72e474-catalog-content\") pod \"redhat-operators-jqkkv\" (UID: \"80049c09-f854-4777-aeae-08234d72e474\") " pod="openshift-marketplace/redhat-operators-jqkkv" Nov 25 09:50:52 crc kubenswrapper[4565]: I1125 09:50:52.562520 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzhp2\" (UniqueName: \"kubernetes.io/projected/80049c09-f854-4777-aeae-08234d72e474-kube-api-access-hzhp2\") pod \"redhat-operators-jqkkv\" (UID: \"80049c09-f854-4777-aeae-08234d72e474\") " pod="openshift-marketplace/redhat-operators-jqkkv" Nov 25 09:50:52 crc kubenswrapper[4565]: I1125 09:50:52.562669 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80049c09-f854-4777-aeae-08234d72e474-utilities\") pod \"redhat-operators-jqkkv\" (UID: \"80049c09-f854-4777-aeae-08234d72e474\") " pod="openshift-marketplace/redhat-operators-jqkkv" Nov 25 09:50:52 crc kubenswrapper[4565]: I1125 09:50:52.563464 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80049c09-f854-4777-aeae-08234d72e474-catalog-content\") pod \"redhat-operators-jqkkv\" (UID: \"80049c09-f854-4777-aeae-08234d72e474\") " pod="openshift-marketplace/redhat-operators-jqkkv" Nov 25 09:50:52 crc kubenswrapper[4565]: I1125 09:50:52.566129 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80049c09-f854-4777-aeae-08234d72e474-utilities\") pod \"redhat-operators-jqkkv\" (UID: \"80049c09-f854-4777-aeae-08234d72e474\") " pod="openshift-marketplace/redhat-operators-jqkkv" Nov 25 09:50:52 crc kubenswrapper[4565]: I1125 09:50:52.597703 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzhp2\" (UniqueName: \"kubernetes.io/projected/80049c09-f854-4777-aeae-08234d72e474-kube-api-access-hzhp2\") pod \"redhat-operators-jqkkv\" (UID: \"80049c09-f854-4777-aeae-08234d72e474\") " pod="openshift-marketplace/redhat-operators-jqkkv" Nov 25 09:50:52 crc kubenswrapper[4565]: I1125 09:50:52.727558 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jqkkv" Nov 25 09:50:55 crc kubenswrapper[4565]: I1125 09:50:55.359316 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jqkkv"] Nov 25 09:50:55 crc kubenswrapper[4565]: I1125 09:50:55.961603 4565 generic.go:334] "Generic (PLEG): container finished" podID="80049c09-f854-4777-aeae-08234d72e474" containerID="03bfebee031b660728b5652e182b46c0a26a084be0e2ff83aaac683b7e4be366" exitCode=0 Nov 25 09:50:55 crc kubenswrapper[4565]: I1125 09:50:55.961712 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jqkkv" event={"ID":"80049c09-f854-4777-aeae-08234d72e474","Type":"ContainerDied","Data":"03bfebee031b660728b5652e182b46c0a26a084be0e2ff83aaac683b7e4be366"} Nov 25 09:50:55 crc kubenswrapper[4565]: I1125 09:50:55.962083 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jqkkv" event={"ID":"80049c09-f854-4777-aeae-08234d72e474","Type":"ContainerStarted","Data":"e603679177750b21f44e7f975df61d871ec582e6352c967f4ac4a12dda149cd6"} Nov 25 09:50:55 crc kubenswrapper[4565]: I1125 09:50:55.965764 4565 provider.go:102] Refreshing cache for 
provider: *credentialprovider.defaultDockerConfigProvider Nov 25 09:50:58 crc kubenswrapper[4565]: I1125 09:50:58.002953 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jqkkv" event={"ID":"80049c09-f854-4777-aeae-08234d72e474","Type":"ContainerStarted","Data":"3a6979dea6554aa59c4a548d458082ca372e318fe10e6e936648b4bdde48c8fd"} Nov 25 09:51:00 crc kubenswrapper[4565]: I1125 09:51:00.049223 4565 generic.go:334] "Generic (PLEG): container finished" podID="80049c09-f854-4777-aeae-08234d72e474" containerID="3a6979dea6554aa59c4a548d458082ca372e318fe10e6e936648b4bdde48c8fd" exitCode=0 Nov 25 09:51:00 crc kubenswrapper[4565]: I1125 09:51:00.049640 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jqkkv" event={"ID":"80049c09-f854-4777-aeae-08234d72e474","Type":"ContainerDied","Data":"3a6979dea6554aa59c4a548d458082ca372e318fe10e6e936648b4bdde48c8fd"} Nov 25 09:51:01 crc kubenswrapper[4565]: I1125 09:51:01.062755 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jqkkv" event={"ID":"80049c09-f854-4777-aeae-08234d72e474","Type":"ContainerStarted","Data":"e93593cc7631ffa02410dc24a3e039b6e0358abd0038b7ee71bbce2743ef5be6"} Nov 25 09:51:01 crc kubenswrapper[4565]: I1125 09:51:01.082705 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jqkkv" podStartSLOduration=4.443286917 podStartE2EDuration="9.082684086s" podCreationTimestamp="2025-11-25 09:50:52 +0000 UTC" firstStartedPulling="2025-11-25 09:50:55.965505855 +0000 UTC m=+2789.168000993" lastFinishedPulling="2025-11-25 09:51:00.604903024 +0000 UTC m=+2793.807398162" observedRunningTime="2025-11-25 09:51:01.076272879 +0000 UTC m=+2794.278768017" watchObservedRunningTime="2025-11-25 09:51:01.082684086 +0000 UTC m=+2794.285179224" Nov 25 09:51:02 crc kubenswrapper[4565]: I1125 09:51:02.728722 4565 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jqkkv" Nov 25 09:51:02 crc kubenswrapper[4565]: I1125 09:51:02.729255 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jqkkv" Nov 25 09:51:03 crc kubenswrapper[4565]: I1125 09:51:03.795966 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jqkkv" podUID="80049c09-f854-4777-aeae-08234d72e474" containerName="registry-server" probeResult="failure" output=< Nov 25 09:51:03 crc kubenswrapper[4565]: timeout: failed to connect service ":50051" within 1s Nov 25 09:51:03 crc kubenswrapper[4565]: > Nov 25 09:51:13 crc kubenswrapper[4565]: I1125 09:51:13.787378 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jqkkv" podUID="80049c09-f854-4777-aeae-08234d72e474" containerName="registry-server" probeResult="failure" output=< Nov 25 09:51:13 crc kubenswrapper[4565]: timeout: failed to connect service ":50051" within 1s Nov 25 09:51:13 crc kubenswrapper[4565]: > Nov 25 09:51:22 crc kubenswrapper[4565]: I1125 09:51:22.771388 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jqkkv" Nov 25 09:51:22 crc kubenswrapper[4565]: I1125 09:51:22.841789 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jqkkv" Nov 25 09:51:23 crc kubenswrapper[4565]: E1125 09:51:23.532265 4565 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Nov 25 09:51:23 crc kubenswrapper[4565]: E1125 09:51:23.533729 4565 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j6gbs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(a3071a8a-a30b-4b2b-aea0-5882f4eff1b2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 09:51:23 crc kubenswrapper[4565]: E1125 09:51:23.534991 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="a3071a8a-a30b-4b2b-aea0-5882f4eff1b2" Nov 25 09:51:23 crc kubenswrapper[4565]: I1125 09:51:23.589063 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jqkkv"] Nov 25 09:51:24 crc kubenswrapper[4565]: I1125 09:51:24.402343 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jqkkv" podUID="80049c09-f854-4777-aeae-08234d72e474" containerName="registry-server" 
containerID="cri-o://e93593cc7631ffa02410dc24a3e039b6e0358abd0038b7ee71bbce2743ef5be6" gracePeriod=2 Nov 25 09:51:24 crc kubenswrapper[4565]: E1125 09:51:24.404253 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="a3071a8a-a30b-4b2b-aea0-5882f4eff1b2" Nov 25 09:51:24 crc kubenswrapper[4565]: I1125 09:51:24.905114 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jqkkv" Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.025426 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80049c09-f854-4777-aeae-08234d72e474-utilities\") pod \"80049c09-f854-4777-aeae-08234d72e474\" (UID: \"80049c09-f854-4777-aeae-08234d72e474\") " Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.025598 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80049c09-f854-4777-aeae-08234d72e474-catalog-content\") pod \"80049c09-f854-4777-aeae-08234d72e474\" (UID: \"80049c09-f854-4777-aeae-08234d72e474\") " Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.025785 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzhp2\" (UniqueName: \"kubernetes.io/projected/80049c09-f854-4777-aeae-08234d72e474-kube-api-access-hzhp2\") pod \"80049c09-f854-4777-aeae-08234d72e474\" (UID: \"80049c09-f854-4777-aeae-08234d72e474\") " Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.026091 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80049c09-f854-4777-aeae-08234d72e474-utilities" 
(OuterVolumeSpecName: "utilities") pod "80049c09-f854-4777-aeae-08234d72e474" (UID: "80049c09-f854-4777-aeae-08234d72e474"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.026657 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80049c09-f854-4777-aeae-08234d72e474-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.034111 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80049c09-f854-4777-aeae-08234d72e474-kube-api-access-hzhp2" (OuterVolumeSpecName: "kube-api-access-hzhp2") pod "80049c09-f854-4777-aeae-08234d72e474" (UID: "80049c09-f854-4777-aeae-08234d72e474"). InnerVolumeSpecName "kube-api-access-hzhp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.100035 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.100490 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.108985 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80049c09-f854-4777-aeae-08234d72e474-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "80049c09-f854-4777-aeae-08234d72e474" (UID: 
"80049c09-f854-4777-aeae-08234d72e474"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.129243 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzhp2\" (UniqueName: \"kubernetes.io/projected/80049c09-f854-4777-aeae-08234d72e474-kube-api-access-hzhp2\") on node \"crc\" DevicePath \"\"" Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.129281 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80049c09-f854-4777-aeae-08234d72e474-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.414743 4565 generic.go:334] "Generic (PLEG): container finished" podID="80049c09-f854-4777-aeae-08234d72e474" containerID="e93593cc7631ffa02410dc24a3e039b6e0358abd0038b7ee71bbce2743ef5be6" exitCode=0 Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.414799 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jqkkv" event={"ID":"80049c09-f854-4777-aeae-08234d72e474","Type":"ContainerDied","Data":"e93593cc7631ffa02410dc24a3e039b6e0358abd0038b7ee71bbce2743ef5be6"} Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.414837 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jqkkv" event={"ID":"80049c09-f854-4777-aeae-08234d72e474","Type":"ContainerDied","Data":"e603679177750b21f44e7f975df61d871ec582e6352c967f4ac4a12dda149cd6"} Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.414875 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jqkkv" Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.414877 4565 scope.go:117] "RemoveContainer" containerID="e93593cc7631ffa02410dc24a3e039b6e0358abd0038b7ee71bbce2743ef5be6" Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.444246 4565 scope.go:117] "RemoveContainer" containerID="3a6979dea6554aa59c4a548d458082ca372e318fe10e6e936648b4bdde48c8fd" Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.455220 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jqkkv"] Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.464497 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jqkkv"] Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.466056 4565 scope.go:117] "RemoveContainer" containerID="03bfebee031b660728b5652e182b46c0a26a084be0e2ff83aaac683b7e4be366" Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.499644 4565 scope.go:117] "RemoveContainer" containerID="e93593cc7631ffa02410dc24a3e039b6e0358abd0038b7ee71bbce2743ef5be6" Nov 25 09:51:25 crc kubenswrapper[4565]: E1125 09:51:25.500253 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e93593cc7631ffa02410dc24a3e039b6e0358abd0038b7ee71bbce2743ef5be6\": container with ID starting with e93593cc7631ffa02410dc24a3e039b6e0358abd0038b7ee71bbce2743ef5be6 not found: ID does not exist" containerID="e93593cc7631ffa02410dc24a3e039b6e0358abd0038b7ee71bbce2743ef5be6" Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.500351 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e93593cc7631ffa02410dc24a3e039b6e0358abd0038b7ee71bbce2743ef5be6"} err="failed to get container status \"e93593cc7631ffa02410dc24a3e039b6e0358abd0038b7ee71bbce2743ef5be6\": rpc error: code = NotFound desc = could not find container 
\"e93593cc7631ffa02410dc24a3e039b6e0358abd0038b7ee71bbce2743ef5be6\": container with ID starting with e93593cc7631ffa02410dc24a3e039b6e0358abd0038b7ee71bbce2743ef5be6 not found: ID does not exist" Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.500441 4565 scope.go:117] "RemoveContainer" containerID="3a6979dea6554aa59c4a548d458082ca372e318fe10e6e936648b4bdde48c8fd" Nov 25 09:51:25 crc kubenswrapper[4565]: E1125 09:51:25.500861 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a6979dea6554aa59c4a548d458082ca372e318fe10e6e936648b4bdde48c8fd\": container with ID starting with 3a6979dea6554aa59c4a548d458082ca372e318fe10e6e936648b4bdde48c8fd not found: ID does not exist" containerID="3a6979dea6554aa59c4a548d458082ca372e318fe10e6e936648b4bdde48c8fd" Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.500898 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a6979dea6554aa59c4a548d458082ca372e318fe10e6e936648b4bdde48c8fd"} err="failed to get container status \"3a6979dea6554aa59c4a548d458082ca372e318fe10e6e936648b4bdde48c8fd\": rpc error: code = NotFound desc = could not find container \"3a6979dea6554aa59c4a548d458082ca372e318fe10e6e936648b4bdde48c8fd\": container with ID starting with 3a6979dea6554aa59c4a548d458082ca372e318fe10e6e936648b4bdde48c8fd not found: ID does not exist" Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.500939 4565 scope.go:117] "RemoveContainer" containerID="03bfebee031b660728b5652e182b46c0a26a084be0e2ff83aaac683b7e4be366" Nov 25 09:51:25 crc kubenswrapper[4565]: E1125 09:51:25.501294 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03bfebee031b660728b5652e182b46c0a26a084be0e2ff83aaac683b7e4be366\": container with ID starting with 03bfebee031b660728b5652e182b46c0a26a084be0e2ff83aaac683b7e4be366 not found: ID does not exist" 
containerID="03bfebee031b660728b5652e182b46c0a26a084be0e2ff83aaac683b7e4be366" Nov 25 09:51:25 crc kubenswrapper[4565]: I1125 09:51:25.501378 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03bfebee031b660728b5652e182b46c0a26a084be0e2ff83aaac683b7e4be366"} err="failed to get container status \"03bfebee031b660728b5652e182b46c0a26a084be0e2ff83aaac683b7e4be366\": rpc error: code = NotFound desc = could not find container \"03bfebee031b660728b5652e182b46c0a26a084be0e2ff83aaac683b7e4be366\": container with ID starting with 03bfebee031b660728b5652e182b46c0a26a084be0e2ff83aaac683b7e4be366 not found: ID does not exist" Nov 25 09:51:27 crc kubenswrapper[4565]: I1125 09:51:27.122654 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80049c09-f854-4777-aeae-08234d72e474" path="/var/lib/kubelet/pods/80049c09-f854-4777-aeae-08234d72e474/volumes" Nov 25 09:51:39 crc kubenswrapper[4565]: I1125 09:51:39.615247 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Nov 25 09:51:40 crc kubenswrapper[4565]: I1125 09:51:40.564494 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2","Type":"ContainerStarted","Data":"44d0883560c7569ff51bdc3f478ac620a5610e14f048dd56ba5ddd8a0b24fd52"} Nov 25 09:51:40 crc kubenswrapper[4565]: I1125 09:51:40.598283 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.597078258 podStartE2EDuration="56.598257134s" podCreationTimestamp="2025-11-25 09:50:44 +0000 UTC" firstStartedPulling="2025-11-25 09:50:46.610229912 +0000 UTC m=+2779.812725040" lastFinishedPulling="2025-11-25 09:51:39.611408778 +0000 UTC m=+2832.813903916" observedRunningTime="2025-11-25 09:51:40.586740142 +0000 UTC m=+2833.789235280" watchObservedRunningTime="2025-11-25 
09:51:40.598257134 +0000 UTC m=+2833.800752272" Nov 25 09:51:55 crc kubenswrapper[4565]: I1125 09:51:55.099441 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:51:55 crc kubenswrapper[4565]: I1125 09:51:55.100300 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:52:25 crc kubenswrapper[4565]: I1125 09:52:25.101108 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:52:25 crc kubenswrapper[4565]: I1125 09:52:25.101871 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 09:52:25 crc kubenswrapper[4565]: I1125 09:52:25.107182 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 09:52:25 crc kubenswrapper[4565]: I1125 09:52:25.107594 4565 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca"} pod="openshift-machine-config-operator/machine-config-daemon-r28bt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 09:52:25 crc kubenswrapper[4565]: I1125 09:52:25.107667 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" containerID="cri-o://a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" gracePeriod=600 Nov 25 09:52:25 crc kubenswrapper[4565]: E1125 09:52:25.246268 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:52:26 crc kubenswrapper[4565]: I1125 09:52:26.039907 4565 generic.go:334] "Generic (PLEG): container finished" podID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" exitCode=0 Nov 25 09:52:26 crc kubenswrapper[4565]: I1125 09:52:26.039993 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerDied","Data":"a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca"} Nov 25 09:52:26 crc kubenswrapper[4565]: I1125 09:52:26.040072 4565 scope.go:117] "RemoveContainer" containerID="dbd04505ddc8880f571911ca07bdffbff9a145427b2f29adc42f041a9dc56899" Nov 25 09:52:26 crc kubenswrapper[4565]: I1125 09:52:26.041168 4565 
scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" Nov 25 09:52:26 crc kubenswrapper[4565]: E1125 09:52:26.042173 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:52:35 crc kubenswrapper[4565]: I1125 09:52:35.033109 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-btsx5"] Nov 25 09:52:35 crc kubenswrapper[4565]: E1125 09:52:35.037171 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80049c09-f854-4777-aeae-08234d72e474" containerName="registry-server" Nov 25 09:52:35 crc kubenswrapper[4565]: I1125 09:52:35.037215 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="80049c09-f854-4777-aeae-08234d72e474" containerName="registry-server" Nov 25 09:52:35 crc kubenswrapper[4565]: E1125 09:52:35.037247 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80049c09-f854-4777-aeae-08234d72e474" containerName="extract-utilities" Nov 25 09:52:35 crc kubenswrapper[4565]: I1125 09:52:35.037259 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="80049c09-f854-4777-aeae-08234d72e474" containerName="extract-utilities" Nov 25 09:52:35 crc kubenswrapper[4565]: E1125 09:52:35.037274 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80049c09-f854-4777-aeae-08234d72e474" containerName="extract-content" Nov 25 09:52:35 crc kubenswrapper[4565]: I1125 09:52:35.037280 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="80049c09-f854-4777-aeae-08234d72e474" containerName="extract-content" Nov 25 09:52:35 crc 
kubenswrapper[4565]: I1125 09:52:35.037510 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="80049c09-f854-4777-aeae-08234d72e474" containerName="registry-server" Nov 25 09:52:35 crc kubenswrapper[4565]: I1125 09:52:35.039262 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-btsx5" Nov 25 09:52:35 crc kubenswrapper[4565]: I1125 09:52:35.054717 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-btsx5"] Nov 25 09:52:35 crc kubenswrapper[4565]: I1125 09:52:35.107229 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c754c8e3-63d9-4702-a677-24a567e06bed-utilities\") pod \"certified-operators-btsx5\" (UID: \"c754c8e3-63d9-4702-a677-24a567e06bed\") " pod="openshift-marketplace/certified-operators-btsx5" Nov 25 09:52:35 crc kubenswrapper[4565]: I1125 09:52:35.107327 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c754c8e3-63d9-4702-a677-24a567e06bed-catalog-content\") pod \"certified-operators-btsx5\" (UID: \"c754c8e3-63d9-4702-a677-24a567e06bed\") " pod="openshift-marketplace/certified-operators-btsx5" Nov 25 09:52:35 crc kubenswrapper[4565]: I1125 09:52:35.107390 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45x4l\" (UniqueName: \"kubernetes.io/projected/c754c8e3-63d9-4702-a677-24a567e06bed-kube-api-access-45x4l\") pod \"certified-operators-btsx5\" (UID: \"c754c8e3-63d9-4702-a677-24a567e06bed\") " pod="openshift-marketplace/certified-operators-btsx5" Nov 25 09:52:35 crc kubenswrapper[4565]: I1125 09:52:35.209012 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c754c8e3-63d9-4702-a677-24a567e06bed-catalog-content\") pod \"certified-operators-btsx5\" (UID: \"c754c8e3-63d9-4702-a677-24a567e06bed\") " pod="openshift-marketplace/certified-operators-btsx5" Nov 25 09:52:35 crc kubenswrapper[4565]: I1125 09:52:35.209079 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45x4l\" (UniqueName: \"kubernetes.io/projected/c754c8e3-63d9-4702-a677-24a567e06bed-kube-api-access-45x4l\") pod \"certified-operators-btsx5\" (UID: \"c754c8e3-63d9-4702-a677-24a567e06bed\") " pod="openshift-marketplace/certified-operators-btsx5" Nov 25 09:52:35 crc kubenswrapper[4565]: I1125 09:52:35.209181 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c754c8e3-63d9-4702-a677-24a567e06bed-utilities\") pod \"certified-operators-btsx5\" (UID: \"c754c8e3-63d9-4702-a677-24a567e06bed\") " pod="openshift-marketplace/certified-operators-btsx5" Nov 25 09:52:35 crc kubenswrapper[4565]: I1125 09:52:35.209600 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c754c8e3-63d9-4702-a677-24a567e06bed-catalog-content\") pod \"certified-operators-btsx5\" (UID: \"c754c8e3-63d9-4702-a677-24a567e06bed\") " pod="openshift-marketplace/certified-operators-btsx5" Nov 25 09:52:35 crc kubenswrapper[4565]: I1125 09:52:35.209680 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c754c8e3-63d9-4702-a677-24a567e06bed-utilities\") pod \"certified-operators-btsx5\" (UID: \"c754c8e3-63d9-4702-a677-24a567e06bed\") " pod="openshift-marketplace/certified-operators-btsx5" Nov 25 09:52:35 crc kubenswrapper[4565]: I1125 09:52:35.233830 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45x4l\" (UniqueName: 
\"kubernetes.io/projected/c754c8e3-63d9-4702-a677-24a567e06bed-kube-api-access-45x4l\") pod \"certified-operators-btsx5\" (UID: \"c754c8e3-63d9-4702-a677-24a567e06bed\") " pod="openshift-marketplace/certified-operators-btsx5" Nov 25 09:52:35 crc kubenswrapper[4565]: I1125 09:52:35.361824 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-btsx5" Nov 25 09:52:35 crc kubenswrapper[4565]: I1125 09:52:35.811886 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-btsx5"] Nov 25 09:52:36 crc kubenswrapper[4565]: I1125 09:52:36.140211 4565 generic.go:334] "Generic (PLEG): container finished" podID="c754c8e3-63d9-4702-a677-24a567e06bed" containerID="7b1977f93183cf0ad9dfdfbb71f71b71e8daa786c0f91870ef072ac987a2b6c1" exitCode=0 Nov 25 09:52:36 crc kubenswrapper[4565]: I1125 09:52:36.141088 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-btsx5" event={"ID":"c754c8e3-63d9-4702-a677-24a567e06bed","Type":"ContainerDied","Data":"7b1977f93183cf0ad9dfdfbb71f71b71e8daa786c0f91870ef072ac987a2b6c1"} Nov 25 09:52:36 crc kubenswrapper[4565]: I1125 09:52:36.141303 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-btsx5" event={"ID":"c754c8e3-63d9-4702-a677-24a567e06bed","Type":"ContainerStarted","Data":"2fc078af8c2bbae3a73c03bfebacecb60b7247da74e53a0715dda2a5efa864dd"} Nov 25 09:52:37 crc kubenswrapper[4565]: I1125 09:52:37.151054 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-btsx5" event={"ID":"c754c8e3-63d9-4702-a677-24a567e06bed","Type":"ContainerStarted","Data":"e5ff306fa0b6bf215dd88a8705cfa2acc2a035aef2a6835c9e27cb2f0c1df34a"} Nov 25 09:52:38 crc kubenswrapper[4565]: I1125 09:52:38.163755 4565 generic.go:334] "Generic (PLEG): container finished" podID="c754c8e3-63d9-4702-a677-24a567e06bed" 
containerID="e5ff306fa0b6bf215dd88a8705cfa2acc2a035aef2a6835c9e27cb2f0c1df34a" exitCode=0 Nov 25 09:52:38 crc kubenswrapper[4565]: I1125 09:52:38.165248 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-btsx5" event={"ID":"c754c8e3-63d9-4702-a677-24a567e06bed","Type":"ContainerDied","Data":"e5ff306fa0b6bf215dd88a8705cfa2acc2a035aef2a6835c9e27cb2f0c1df34a"} Nov 25 09:52:39 crc kubenswrapper[4565]: I1125 09:52:39.175143 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-btsx5" event={"ID":"c754c8e3-63d9-4702-a677-24a567e06bed","Type":"ContainerStarted","Data":"274fac99549a56e5f180082ad5f5a8716f714edf3fcb557652ecbf8bc3360c76"} Nov 25 09:52:39 crc kubenswrapper[4565]: I1125 09:52:39.198580 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-btsx5" podStartSLOduration=1.700572027 podStartE2EDuration="4.198558331s" podCreationTimestamp="2025-11-25 09:52:35 +0000 UTC" firstStartedPulling="2025-11-25 09:52:36.141826237 +0000 UTC m=+2889.344321376" lastFinishedPulling="2025-11-25 09:52:38.639812541 +0000 UTC m=+2891.842307680" observedRunningTime="2025-11-25 09:52:39.19153456 +0000 UTC m=+2892.394029698" watchObservedRunningTime="2025-11-25 09:52:39.198558331 +0000 UTC m=+2892.401053469" Nov 25 09:52:40 crc kubenswrapper[4565]: I1125 09:52:40.097857 4565 scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" Nov 25 09:52:40 crc kubenswrapper[4565]: E1125 09:52:40.098752 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" 
podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:52:45 crc kubenswrapper[4565]: I1125 09:52:45.362184 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-btsx5" Nov 25 09:52:45 crc kubenswrapper[4565]: I1125 09:52:45.362670 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-btsx5" Nov 25 09:52:45 crc kubenswrapper[4565]: I1125 09:52:45.404546 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-btsx5" Nov 25 09:52:46 crc kubenswrapper[4565]: I1125 09:52:46.286649 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-btsx5" Nov 25 09:52:46 crc kubenswrapper[4565]: I1125 09:52:46.330189 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-btsx5"] Nov 25 09:52:48 crc kubenswrapper[4565]: I1125 09:52:48.259699 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-btsx5" podUID="c754c8e3-63d9-4702-a677-24a567e06bed" containerName="registry-server" containerID="cri-o://274fac99549a56e5f180082ad5f5a8716f714edf3fcb557652ecbf8bc3360c76" gracePeriod=2 Nov 25 09:52:48 crc kubenswrapper[4565]: I1125 09:52:48.725975 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-btsx5" Nov 25 09:52:48 crc kubenswrapper[4565]: I1125 09:52:48.745474 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c754c8e3-63d9-4702-a677-24a567e06bed-utilities\") pod \"c754c8e3-63d9-4702-a677-24a567e06bed\" (UID: \"c754c8e3-63d9-4702-a677-24a567e06bed\") " Nov 25 09:52:48 crc kubenswrapper[4565]: I1125 09:52:48.745542 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c754c8e3-63d9-4702-a677-24a567e06bed-catalog-content\") pod \"c754c8e3-63d9-4702-a677-24a567e06bed\" (UID: \"c754c8e3-63d9-4702-a677-24a567e06bed\") " Nov 25 09:52:48 crc kubenswrapper[4565]: I1125 09:52:48.745755 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45x4l\" (UniqueName: \"kubernetes.io/projected/c754c8e3-63d9-4702-a677-24a567e06bed-kube-api-access-45x4l\") pod \"c754c8e3-63d9-4702-a677-24a567e06bed\" (UID: \"c754c8e3-63d9-4702-a677-24a567e06bed\") " Nov 25 09:52:48 crc kubenswrapper[4565]: I1125 09:52:48.748120 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c754c8e3-63d9-4702-a677-24a567e06bed-utilities" (OuterVolumeSpecName: "utilities") pod "c754c8e3-63d9-4702-a677-24a567e06bed" (UID: "c754c8e3-63d9-4702-a677-24a567e06bed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:52:48 crc kubenswrapper[4565]: I1125 09:52:48.768172 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c754c8e3-63d9-4702-a677-24a567e06bed-kube-api-access-45x4l" (OuterVolumeSpecName: "kube-api-access-45x4l") pod "c754c8e3-63d9-4702-a677-24a567e06bed" (UID: "c754c8e3-63d9-4702-a677-24a567e06bed"). InnerVolumeSpecName "kube-api-access-45x4l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 09:52:48 crc kubenswrapper[4565]: I1125 09:52:48.800428 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c754c8e3-63d9-4702-a677-24a567e06bed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c754c8e3-63d9-4702-a677-24a567e06bed" (UID: "c754c8e3-63d9-4702-a677-24a567e06bed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 09:52:48 crc kubenswrapper[4565]: I1125 09:52:48.848415 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c754c8e3-63d9-4702-a677-24a567e06bed-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 09:52:48 crc kubenswrapper[4565]: I1125 09:52:48.848446 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c754c8e3-63d9-4702-a677-24a567e06bed-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 09:52:48 crc kubenswrapper[4565]: I1125 09:52:48.848462 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45x4l\" (UniqueName: \"kubernetes.io/projected/c754c8e3-63d9-4702-a677-24a567e06bed-kube-api-access-45x4l\") on node \"crc\" DevicePath \"\"" Nov 25 09:52:49 crc kubenswrapper[4565]: I1125 09:52:49.278338 4565 generic.go:334] "Generic (PLEG): container finished" podID="c754c8e3-63d9-4702-a677-24a567e06bed" containerID="274fac99549a56e5f180082ad5f5a8716f714edf3fcb557652ecbf8bc3360c76" exitCode=0 Nov 25 09:52:49 crc kubenswrapper[4565]: I1125 09:52:49.278404 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-btsx5" event={"ID":"c754c8e3-63d9-4702-a677-24a567e06bed","Type":"ContainerDied","Data":"274fac99549a56e5f180082ad5f5a8716f714edf3fcb557652ecbf8bc3360c76"} Nov 25 09:52:49 crc kubenswrapper[4565]: I1125 09:52:49.278446 4565 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-btsx5" event={"ID":"c754c8e3-63d9-4702-a677-24a567e06bed","Type":"ContainerDied","Data":"2fc078af8c2bbae3a73c03bfebacecb60b7247da74e53a0715dda2a5efa864dd"} Nov 25 09:52:49 crc kubenswrapper[4565]: I1125 09:52:49.278468 4565 scope.go:117] "RemoveContainer" containerID="274fac99549a56e5f180082ad5f5a8716f714edf3fcb557652ecbf8bc3360c76" Nov 25 09:52:49 crc kubenswrapper[4565]: I1125 09:52:49.278629 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-btsx5" Nov 25 09:52:49 crc kubenswrapper[4565]: I1125 09:52:49.304912 4565 scope.go:117] "RemoveContainer" containerID="e5ff306fa0b6bf215dd88a8705cfa2acc2a035aef2a6835c9e27cb2f0c1df34a" Nov 25 09:52:49 crc kubenswrapper[4565]: I1125 09:52:49.306071 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-btsx5"] Nov 25 09:52:49 crc kubenswrapper[4565]: I1125 09:52:49.313077 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-btsx5"] Nov 25 09:52:49 crc kubenswrapper[4565]: I1125 09:52:49.326122 4565 scope.go:117] "RemoveContainer" containerID="7b1977f93183cf0ad9dfdfbb71f71b71e8daa786c0f91870ef072ac987a2b6c1" Nov 25 09:52:49 crc kubenswrapper[4565]: I1125 09:52:49.356826 4565 scope.go:117] "RemoveContainer" containerID="274fac99549a56e5f180082ad5f5a8716f714edf3fcb557652ecbf8bc3360c76" Nov 25 09:52:49 crc kubenswrapper[4565]: E1125 09:52:49.358439 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"274fac99549a56e5f180082ad5f5a8716f714edf3fcb557652ecbf8bc3360c76\": container with ID starting with 274fac99549a56e5f180082ad5f5a8716f714edf3fcb557652ecbf8bc3360c76 not found: ID does not exist" containerID="274fac99549a56e5f180082ad5f5a8716f714edf3fcb557652ecbf8bc3360c76" Nov 25 09:52:49 crc kubenswrapper[4565]: I1125 
09:52:49.358473 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"274fac99549a56e5f180082ad5f5a8716f714edf3fcb557652ecbf8bc3360c76"} err="failed to get container status \"274fac99549a56e5f180082ad5f5a8716f714edf3fcb557652ecbf8bc3360c76\": rpc error: code = NotFound desc = could not find container \"274fac99549a56e5f180082ad5f5a8716f714edf3fcb557652ecbf8bc3360c76\": container with ID starting with 274fac99549a56e5f180082ad5f5a8716f714edf3fcb557652ecbf8bc3360c76 not found: ID does not exist" Nov 25 09:52:49 crc kubenswrapper[4565]: I1125 09:52:49.358496 4565 scope.go:117] "RemoveContainer" containerID="e5ff306fa0b6bf215dd88a8705cfa2acc2a035aef2a6835c9e27cb2f0c1df34a" Nov 25 09:52:49 crc kubenswrapper[4565]: E1125 09:52:49.358695 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5ff306fa0b6bf215dd88a8705cfa2acc2a035aef2a6835c9e27cb2f0c1df34a\": container with ID starting with e5ff306fa0b6bf215dd88a8705cfa2acc2a035aef2a6835c9e27cb2f0c1df34a not found: ID does not exist" containerID="e5ff306fa0b6bf215dd88a8705cfa2acc2a035aef2a6835c9e27cb2f0c1df34a" Nov 25 09:52:49 crc kubenswrapper[4565]: I1125 09:52:49.358720 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5ff306fa0b6bf215dd88a8705cfa2acc2a035aef2a6835c9e27cb2f0c1df34a"} err="failed to get container status \"e5ff306fa0b6bf215dd88a8705cfa2acc2a035aef2a6835c9e27cb2f0c1df34a\": rpc error: code = NotFound desc = could not find container \"e5ff306fa0b6bf215dd88a8705cfa2acc2a035aef2a6835c9e27cb2f0c1df34a\": container with ID starting with e5ff306fa0b6bf215dd88a8705cfa2acc2a035aef2a6835c9e27cb2f0c1df34a not found: ID does not exist" Nov 25 09:52:49 crc kubenswrapper[4565]: I1125 09:52:49.358735 4565 scope.go:117] "RemoveContainer" containerID="7b1977f93183cf0ad9dfdfbb71f71b71e8daa786c0f91870ef072ac987a2b6c1" Nov 25 09:52:49 crc 
kubenswrapper[4565]: E1125 09:52:49.359000 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b1977f93183cf0ad9dfdfbb71f71b71e8daa786c0f91870ef072ac987a2b6c1\": container with ID starting with 7b1977f93183cf0ad9dfdfbb71f71b71e8daa786c0f91870ef072ac987a2b6c1 not found: ID does not exist" containerID="7b1977f93183cf0ad9dfdfbb71f71b71e8daa786c0f91870ef072ac987a2b6c1" Nov 25 09:52:49 crc kubenswrapper[4565]: I1125 09:52:49.359036 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b1977f93183cf0ad9dfdfbb71f71b71e8daa786c0f91870ef072ac987a2b6c1"} err="failed to get container status \"7b1977f93183cf0ad9dfdfbb71f71b71e8daa786c0f91870ef072ac987a2b6c1\": rpc error: code = NotFound desc = could not find container \"7b1977f93183cf0ad9dfdfbb71f71b71e8daa786c0f91870ef072ac987a2b6c1\": container with ID starting with 7b1977f93183cf0ad9dfdfbb71f71b71e8daa786c0f91870ef072ac987a2b6c1 not found: ID does not exist" Nov 25 09:52:51 crc kubenswrapper[4565]: I1125 09:52:51.108220 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c754c8e3-63d9-4702-a677-24a567e06bed" path="/var/lib/kubelet/pods/c754c8e3-63d9-4702-a677-24a567e06bed/volumes" Nov 25 09:52:53 crc kubenswrapper[4565]: I1125 09:52:53.099396 4565 scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" Nov 25 09:52:53 crc kubenswrapper[4565]: E1125 09:52:53.100326 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:53:05 crc 
kubenswrapper[4565]: I1125 09:53:05.097798 4565 scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" Nov 25 09:53:05 crc kubenswrapper[4565]: E1125 09:53:05.098799 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:53:19 crc kubenswrapper[4565]: I1125 09:53:19.097847 4565 scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" Nov 25 09:53:19 crc kubenswrapper[4565]: E1125 09:53:19.099102 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:53:31 crc kubenswrapper[4565]: I1125 09:53:31.097447 4565 scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" Nov 25 09:53:31 crc kubenswrapper[4565]: E1125 09:53:31.098540 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 
25 09:53:46 crc kubenswrapper[4565]: I1125 09:53:46.097920 4565 scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" Nov 25 09:53:46 crc kubenswrapper[4565]: E1125 09:53:46.098886 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:53:58 crc kubenswrapper[4565]: I1125 09:53:58.098283 4565 scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" Nov 25 09:53:58 crc kubenswrapper[4565]: E1125 09:53:58.100218 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:54:13 crc kubenswrapper[4565]: I1125 09:54:13.098225 4565 scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" Nov 25 09:54:13 crc kubenswrapper[4565]: E1125 09:54:13.098886 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" 
podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:54:28 crc kubenswrapper[4565]: I1125 09:54:28.099843 4565 scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" Nov 25 09:54:28 crc kubenswrapper[4565]: E1125 09:54:28.103677 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:54:43 crc kubenswrapper[4565]: I1125 09:54:43.097661 4565 scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" Nov 25 09:54:43 crc kubenswrapper[4565]: E1125 09:54:43.098523 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:54:58 crc kubenswrapper[4565]: I1125 09:54:58.098761 4565 scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" Nov 25 09:54:58 crc kubenswrapper[4565]: E1125 09:54:58.099629 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:55:13 crc kubenswrapper[4565]: I1125 09:55:13.097780 4565 scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" Nov 25 09:55:13 crc kubenswrapper[4565]: E1125 09:55:13.098702 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:55:26 crc kubenswrapper[4565]: I1125 09:55:26.098911 4565 scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" Nov 25 09:55:26 crc kubenswrapper[4565]: E1125 09:55:26.100057 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:55:37 crc kubenswrapper[4565]: I1125 09:55:37.104668 4565 scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" Nov 25 09:55:37 crc kubenswrapper[4565]: E1125 09:55:37.105550 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:55:51 crc kubenswrapper[4565]: I1125 09:55:51.098069 4565 scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" Nov 25 09:55:51 crc kubenswrapper[4565]: E1125 09:55:51.099007 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:56:04 crc kubenswrapper[4565]: I1125 09:56:04.097461 4565 scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" Nov 25 09:56:04 crc kubenswrapper[4565]: E1125 09:56:04.098563 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:56:18 crc kubenswrapper[4565]: I1125 09:56:18.097490 4565 scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" Nov 25 09:56:18 crc kubenswrapper[4565]: E1125 09:56:18.098622 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:56:30 crc kubenswrapper[4565]: I1125 09:56:30.098286 4565 scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" Nov 25 09:56:30 crc kubenswrapper[4565]: E1125 09:56:30.099550 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:56:44 crc kubenswrapper[4565]: I1125 09:56:44.097456 4565 scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" Nov 25 09:56:44 crc kubenswrapper[4565]: E1125 09:56:44.098431 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:56:55 crc kubenswrapper[4565]: I1125 09:56:55.098013 4565 scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" Nov 25 09:56:55 crc kubenswrapper[4565]: E1125 09:56:55.098883 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:57:09 crc kubenswrapper[4565]: I1125 09:57:09.099252 4565 scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" Nov 25 09:57:09 crc kubenswrapper[4565]: E1125 09:57:09.100326 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:57:24 crc kubenswrapper[4565]: I1125 09:57:24.097350 4565 scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" Nov 25 09:57:24 crc kubenswrapper[4565]: E1125 09:57:24.098389 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 09:57:37 crc kubenswrapper[4565]: I1125 09:57:37.104226 4565 scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" Nov 25 09:57:38 crc kubenswrapper[4565]: I1125 09:57:38.230189 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" 
event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerStarted","Data":"75b70cc61715acf4a6e67100e637f90a320593b52905b15de68119be8561ea55"} Nov 25 09:58:35 crc kubenswrapper[4565]: I1125 09:58:35.124041 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-lcpln"] Nov 25 09:58:35 crc kubenswrapper[4565]: I1125 09:58:35.157561 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-4214-account-create-ftd48"] Nov 25 09:58:35 crc kubenswrapper[4565]: I1125 09:58:35.176978 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-lcpln"] Nov 25 09:58:35 crc kubenswrapper[4565]: I1125 09:58:35.188871 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-4214-account-create-ftd48"] Nov 25 09:58:37 crc kubenswrapper[4565]: I1125 09:58:37.106330 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="427c19d8-2ea1-4831-8591-8b8d52eb83dd" path="/var/lib/kubelet/pods/427c19d8-2ea1-4831-8591-8b8d52eb83dd/volumes" Nov 25 09:58:37 crc kubenswrapper[4565]: I1125 09:58:37.108474 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bc64fad-0ded-4d44-a53b-cdd2f5cebc45" path="/var/lib/kubelet/pods/6bc64fad-0ded-4d44-a53b-cdd2f5cebc45/volumes" Nov 25 09:58:37 crc kubenswrapper[4565]: I1125 09:58:37.432824 4565 scope.go:117] "RemoveContainer" containerID="b581befa176a1d323f49a2b4397a4188801e223001a1211519342157891532f1" Nov 25 09:58:37 crc kubenswrapper[4565]: I1125 09:58:37.456866 4565 scope.go:117] "RemoveContainer" containerID="781d16635c862d212f8d2bf238daeac708acc52a210abfc018f30477cee35469" Nov 25 09:58:58 crc kubenswrapper[4565]: I1125 09:58:58.046279 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-r5rpm"] Nov 25 09:58:58 crc kubenswrapper[4565]: I1125 09:58:58.055325 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-r5rpm"] Nov 25 09:58:59 crc 
kubenswrapper[4565]: I1125 09:58:59.110629 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b64f1fae-2be7-4ebd-a561-58884c10b4e6" path="/var/lib/kubelet/pods/b64f1fae-2be7-4ebd-a561-58884c10b4e6/volumes" Nov 25 09:59:37 crc kubenswrapper[4565]: I1125 09:59:37.547821 4565 scope.go:117] "RemoveContainer" containerID="fb5f5c3bda09faeafbea87526cb4f73e0fe6787730b9964d3940f0a4ee87791a" Nov 25 09:59:55 crc kubenswrapper[4565]: I1125 09:59:55.099393 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 09:59:55 crc kubenswrapper[4565]: I1125 09:59:55.100100 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 10:00:00 crc kubenswrapper[4565]: I1125 10:00:00.143381 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401080-zxxfl"] Nov 25 10:00:00 crc kubenswrapper[4565]: E1125 10:00:00.145700 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c754c8e3-63d9-4702-a677-24a567e06bed" containerName="extract-utilities" Nov 25 10:00:00 crc kubenswrapper[4565]: I1125 10:00:00.145743 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c754c8e3-63d9-4702-a677-24a567e06bed" containerName="extract-utilities" Nov 25 10:00:00 crc kubenswrapper[4565]: E1125 10:00:00.145758 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c754c8e3-63d9-4702-a677-24a567e06bed" containerName="extract-content" Nov 25 10:00:00 crc kubenswrapper[4565]: I1125 
10:00:00.145766 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c754c8e3-63d9-4702-a677-24a567e06bed" containerName="extract-content" Nov 25 10:00:00 crc kubenswrapper[4565]: E1125 10:00:00.145783 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c754c8e3-63d9-4702-a677-24a567e06bed" containerName="registry-server" Nov 25 10:00:00 crc kubenswrapper[4565]: I1125 10:00:00.145789 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c754c8e3-63d9-4702-a677-24a567e06bed" containerName="registry-server" Nov 25 10:00:00 crc kubenswrapper[4565]: I1125 10:00:00.146081 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="c754c8e3-63d9-4702-a677-24a567e06bed" containerName="registry-server" Nov 25 10:00:00 crc kubenswrapper[4565]: I1125 10:00:00.147008 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401080-zxxfl" Nov 25 10:00:00 crc kubenswrapper[4565]: I1125 10:00:00.148773 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 25 10:00:00 crc kubenswrapper[4565]: I1125 10:00:00.149365 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 25 10:00:00 crc kubenswrapper[4565]: I1125 10:00:00.153250 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401080-zxxfl"] Nov 25 10:00:00 crc kubenswrapper[4565]: I1125 10:00:00.301253 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fh56\" (UniqueName: \"kubernetes.io/projected/71684759-87d9-4506-8c2f-975c13071f7d-kube-api-access-9fh56\") pod \"collect-profiles-29401080-zxxfl\" (UID: \"71684759-87d9-4506-8c2f-975c13071f7d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29401080-zxxfl" Nov 25 10:00:00 crc kubenswrapper[4565]: I1125 10:00:00.301305 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71684759-87d9-4506-8c2f-975c13071f7d-config-volume\") pod \"collect-profiles-29401080-zxxfl\" (UID: \"71684759-87d9-4506-8c2f-975c13071f7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401080-zxxfl" Nov 25 10:00:00 crc kubenswrapper[4565]: I1125 10:00:00.301342 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71684759-87d9-4506-8c2f-975c13071f7d-secret-volume\") pod \"collect-profiles-29401080-zxxfl\" (UID: \"71684759-87d9-4506-8c2f-975c13071f7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401080-zxxfl" Nov 25 10:00:00 crc kubenswrapper[4565]: I1125 10:00:00.404200 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fh56\" (UniqueName: \"kubernetes.io/projected/71684759-87d9-4506-8c2f-975c13071f7d-kube-api-access-9fh56\") pod \"collect-profiles-29401080-zxxfl\" (UID: \"71684759-87d9-4506-8c2f-975c13071f7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401080-zxxfl" Nov 25 10:00:00 crc kubenswrapper[4565]: I1125 10:00:00.404252 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71684759-87d9-4506-8c2f-975c13071f7d-config-volume\") pod \"collect-profiles-29401080-zxxfl\" (UID: \"71684759-87d9-4506-8c2f-975c13071f7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401080-zxxfl" Nov 25 10:00:00 crc kubenswrapper[4565]: I1125 10:00:00.404292 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/71684759-87d9-4506-8c2f-975c13071f7d-secret-volume\") pod \"collect-profiles-29401080-zxxfl\" (UID: \"71684759-87d9-4506-8c2f-975c13071f7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401080-zxxfl" Nov 25 10:00:00 crc kubenswrapper[4565]: I1125 10:00:00.405178 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71684759-87d9-4506-8c2f-975c13071f7d-config-volume\") pod \"collect-profiles-29401080-zxxfl\" (UID: \"71684759-87d9-4506-8c2f-975c13071f7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401080-zxxfl" Nov 25 10:00:00 crc kubenswrapper[4565]: I1125 10:00:00.410881 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71684759-87d9-4506-8c2f-975c13071f7d-secret-volume\") pod \"collect-profiles-29401080-zxxfl\" (UID: \"71684759-87d9-4506-8c2f-975c13071f7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401080-zxxfl" Nov 25 10:00:00 crc kubenswrapper[4565]: I1125 10:00:00.420587 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fh56\" (UniqueName: \"kubernetes.io/projected/71684759-87d9-4506-8c2f-975c13071f7d-kube-api-access-9fh56\") pod \"collect-profiles-29401080-zxxfl\" (UID: \"71684759-87d9-4506-8c2f-975c13071f7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401080-zxxfl" Nov 25 10:00:00 crc kubenswrapper[4565]: I1125 10:00:00.463538 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401080-zxxfl" Nov 25 10:00:00 crc kubenswrapper[4565]: I1125 10:00:00.894273 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401080-zxxfl"] Nov 25 10:00:01 crc kubenswrapper[4565]: I1125 10:00:01.657677 4565 generic.go:334] "Generic (PLEG): container finished" podID="71684759-87d9-4506-8c2f-975c13071f7d" containerID="a625cd866a0b7b5f6a005d53d17c30b680e95ce90222c4f84d08ac0c8f20a4c7" exitCode=0 Nov 25 10:00:01 crc kubenswrapper[4565]: I1125 10:00:01.657763 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401080-zxxfl" event={"ID":"71684759-87d9-4506-8c2f-975c13071f7d","Type":"ContainerDied","Data":"a625cd866a0b7b5f6a005d53d17c30b680e95ce90222c4f84d08ac0c8f20a4c7"} Nov 25 10:00:01 crc kubenswrapper[4565]: I1125 10:00:01.658342 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401080-zxxfl" event={"ID":"71684759-87d9-4506-8c2f-975c13071f7d","Type":"ContainerStarted","Data":"dfcde90db0955a6036a7e9c633ef297211b551c9ca103fac1de15f916dfb925a"} Nov 25 10:00:03 crc kubenswrapper[4565]: I1125 10:00:03.080082 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401080-zxxfl" Nov 25 10:00:03 crc kubenswrapper[4565]: I1125 10:00:03.085618 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71684759-87d9-4506-8c2f-975c13071f7d-config-volume\") pod \"71684759-87d9-4506-8c2f-975c13071f7d\" (UID: \"71684759-87d9-4506-8c2f-975c13071f7d\") " Nov 25 10:00:03 crc kubenswrapper[4565]: I1125 10:00:03.085675 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71684759-87d9-4506-8c2f-975c13071f7d-secret-volume\") pod \"71684759-87d9-4506-8c2f-975c13071f7d\" (UID: \"71684759-87d9-4506-8c2f-975c13071f7d\") " Nov 25 10:00:03 crc kubenswrapper[4565]: I1125 10:00:03.085727 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fh56\" (UniqueName: \"kubernetes.io/projected/71684759-87d9-4506-8c2f-975c13071f7d-kube-api-access-9fh56\") pod \"71684759-87d9-4506-8c2f-975c13071f7d\" (UID: \"71684759-87d9-4506-8c2f-975c13071f7d\") " Nov 25 10:00:03 crc kubenswrapper[4565]: I1125 10:00:03.086588 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71684759-87d9-4506-8c2f-975c13071f7d-config-volume" (OuterVolumeSpecName: "config-volume") pod "71684759-87d9-4506-8c2f-975c13071f7d" (UID: "71684759-87d9-4506-8c2f-975c13071f7d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 10:00:03 crc kubenswrapper[4565]: I1125 10:00:03.091597 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71684759-87d9-4506-8c2f-975c13071f7d-kube-api-access-9fh56" (OuterVolumeSpecName: "kube-api-access-9fh56") pod "71684759-87d9-4506-8c2f-975c13071f7d" (UID: "71684759-87d9-4506-8c2f-975c13071f7d"). 
InnerVolumeSpecName "kube-api-access-9fh56". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 10:00:03 crc kubenswrapper[4565]: I1125 10:00:03.091690 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71684759-87d9-4506-8c2f-975c13071f7d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "71684759-87d9-4506-8c2f-975c13071f7d" (UID: "71684759-87d9-4506-8c2f-975c13071f7d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 10:00:03 crc kubenswrapper[4565]: I1125 10:00:03.189704 4565 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71684759-87d9-4506-8c2f-975c13071f7d-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 10:00:03 crc kubenswrapper[4565]: I1125 10:00:03.189741 4565 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71684759-87d9-4506-8c2f-975c13071f7d-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 25 10:00:03 crc kubenswrapper[4565]: I1125 10:00:03.189752 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fh56\" (UniqueName: \"kubernetes.io/projected/71684759-87d9-4506-8c2f-975c13071f7d-kube-api-access-9fh56\") on node \"crc\" DevicePath \"\"" Nov 25 10:00:03 crc kubenswrapper[4565]: I1125 10:00:03.679097 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401080-zxxfl" event={"ID":"71684759-87d9-4506-8c2f-975c13071f7d","Type":"ContainerDied","Data":"dfcde90db0955a6036a7e9c633ef297211b551c9ca103fac1de15f916dfb925a"} Nov 25 10:00:03 crc kubenswrapper[4565]: I1125 10:00:03.679458 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfcde90db0955a6036a7e9c633ef297211b551c9ca103fac1de15f916dfb925a" Nov 25 10:00:03 crc kubenswrapper[4565]: I1125 10:00:03.679239 4565 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401080-zxxfl" Nov 25 10:00:04 crc kubenswrapper[4565]: I1125 10:00:04.194697 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401035-8dbdl"] Nov 25 10:00:04 crc kubenswrapper[4565]: I1125 10:00:04.206475 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401035-8dbdl"] Nov 25 10:00:05 crc kubenswrapper[4565]: I1125 10:00:05.108657 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62c99be3-6d18-46c5-b8db-831beb1eb80d" path="/var/lib/kubelet/pods/62c99be3-6d18-46c5-b8db-831beb1eb80d/volumes" Nov 25 10:00:08 crc kubenswrapper[4565]: I1125 10:00:08.193824 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rvkhm"] Nov 25 10:00:08 crc kubenswrapper[4565]: E1125 10:00:08.194506 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71684759-87d9-4506-8c2f-975c13071f7d" containerName="collect-profiles" Nov 25 10:00:08 crc kubenswrapper[4565]: I1125 10:00:08.194520 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="71684759-87d9-4506-8c2f-975c13071f7d" containerName="collect-profiles" Nov 25 10:00:08 crc kubenswrapper[4565]: I1125 10:00:08.194710 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="71684759-87d9-4506-8c2f-975c13071f7d" containerName="collect-profiles" Nov 25 10:00:08 crc kubenswrapper[4565]: I1125 10:00:08.195905 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rvkhm" Nov 25 10:00:08 crc kubenswrapper[4565]: I1125 10:00:08.231554 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rvkhm"] Nov 25 10:00:08 crc kubenswrapper[4565]: I1125 10:00:08.300566 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wcmc\" (UniqueName: \"kubernetes.io/projected/17bdae12-18f2-450c-9df4-37ab28cfe0a7-kube-api-access-7wcmc\") pod \"community-operators-rvkhm\" (UID: \"17bdae12-18f2-450c-9df4-37ab28cfe0a7\") " pod="openshift-marketplace/community-operators-rvkhm" Nov 25 10:00:08 crc kubenswrapper[4565]: I1125 10:00:08.300812 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17bdae12-18f2-450c-9df4-37ab28cfe0a7-catalog-content\") pod \"community-operators-rvkhm\" (UID: \"17bdae12-18f2-450c-9df4-37ab28cfe0a7\") " pod="openshift-marketplace/community-operators-rvkhm" Nov 25 10:00:08 crc kubenswrapper[4565]: I1125 10:00:08.300886 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17bdae12-18f2-450c-9df4-37ab28cfe0a7-utilities\") pod \"community-operators-rvkhm\" (UID: \"17bdae12-18f2-450c-9df4-37ab28cfe0a7\") " pod="openshift-marketplace/community-operators-rvkhm" Nov 25 10:00:08 crc kubenswrapper[4565]: I1125 10:00:08.405837 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wcmc\" (UniqueName: \"kubernetes.io/projected/17bdae12-18f2-450c-9df4-37ab28cfe0a7-kube-api-access-7wcmc\") pod \"community-operators-rvkhm\" (UID: \"17bdae12-18f2-450c-9df4-37ab28cfe0a7\") " pod="openshift-marketplace/community-operators-rvkhm" Nov 25 10:00:08 crc kubenswrapper[4565]: I1125 10:00:08.406408 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17bdae12-18f2-450c-9df4-37ab28cfe0a7-catalog-content\") pod \"community-operators-rvkhm\" (UID: \"17bdae12-18f2-450c-9df4-37ab28cfe0a7\") " pod="openshift-marketplace/community-operators-rvkhm" Nov 25 10:00:08 crc kubenswrapper[4565]: I1125 10:00:08.406534 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17bdae12-18f2-450c-9df4-37ab28cfe0a7-utilities\") pod \"community-operators-rvkhm\" (UID: \"17bdae12-18f2-450c-9df4-37ab28cfe0a7\") " pod="openshift-marketplace/community-operators-rvkhm" Nov 25 10:00:08 crc kubenswrapper[4565]: I1125 10:00:08.406876 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17bdae12-18f2-450c-9df4-37ab28cfe0a7-catalog-content\") pod \"community-operators-rvkhm\" (UID: \"17bdae12-18f2-450c-9df4-37ab28cfe0a7\") " pod="openshift-marketplace/community-operators-rvkhm" Nov 25 10:00:08 crc kubenswrapper[4565]: I1125 10:00:08.406948 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17bdae12-18f2-450c-9df4-37ab28cfe0a7-utilities\") pod \"community-operators-rvkhm\" (UID: \"17bdae12-18f2-450c-9df4-37ab28cfe0a7\") " pod="openshift-marketplace/community-operators-rvkhm" Nov 25 10:00:08 crc kubenswrapper[4565]: I1125 10:00:08.438896 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wcmc\" (UniqueName: \"kubernetes.io/projected/17bdae12-18f2-450c-9df4-37ab28cfe0a7-kube-api-access-7wcmc\") pod \"community-operators-rvkhm\" (UID: \"17bdae12-18f2-450c-9df4-37ab28cfe0a7\") " pod="openshift-marketplace/community-operators-rvkhm" Nov 25 10:00:08 crc kubenswrapper[4565]: I1125 10:00:08.525497 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rvkhm" Nov 25 10:00:09 crc kubenswrapper[4565]: I1125 10:00:09.115497 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rvkhm"] Nov 25 10:00:09 crc kubenswrapper[4565]: I1125 10:00:09.737725 4565 generic.go:334] "Generic (PLEG): container finished" podID="17bdae12-18f2-450c-9df4-37ab28cfe0a7" containerID="febf0dda8cf7d948ee92c8bf960075a2e725c2918434a60b3d6e2bf8d0b86d8a" exitCode=0 Nov 25 10:00:09 crc kubenswrapper[4565]: I1125 10:00:09.737831 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvkhm" event={"ID":"17bdae12-18f2-450c-9df4-37ab28cfe0a7","Type":"ContainerDied","Data":"febf0dda8cf7d948ee92c8bf960075a2e725c2918434a60b3d6e2bf8d0b86d8a"} Nov 25 10:00:09 crc kubenswrapper[4565]: I1125 10:00:09.738116 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvkhm" event={"ID":"17bdae12-18f2-450c-9df4-37ab28cfe0a7","Type":"ContainerStarted","Data":"1d20306672ba2791c05972969614b10d5a6095e04ef0c3ac4577e9074a731903"} Nov 25 10:00:09 crc kubenswrapper[4565]: I1125 10:00:09.740307 4565 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 10:00:11 crc kubenswrapper[4565]: I1125 10:00:11.758480 4565 generic.go:334] "Generic (PLEG): container finished" podID="17bdae12-18f2-450c-9df4-37ab28cfe0a7" containerID="4d520f2f4468fbc1bd1b6a4322a272b6ea09d2bea228750c4f70d367719d1a3a" exitCode=0 Nov 25 10:00:11 crc kubenswrapper[4565]: I1125 10:00:11.758652 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvkhm" event={"ID":"17bdae12-18f2-450c-9df4-37ab28cfe0a7","Type":"ContainerDied","Data":"4d520f2f4468fbc1bd1b6a4322a272b6ea09d2bea228750c4f70d367719d1a3a"} Nov 25 10:00:12 crc kubenswrapper[4565]: I1125 10:00:12.771660 4565 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-rvkhm" event={"ID":"17bdae12-18f2-450c-9df4-37ab28cfe0a7","Type":"ContainerStarted","Data":"c4ebe84e71742231a52bf248c858f4017eb8c802267ec95ca9ce3ab0e75bde4f"} Nov 25 10:00:12 crc kubenswrapper[4565]: I1125 10:00:12.809250 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rvkhm" podStartSLOduration=2.228488911 podStartE2EDuration="4.8092236s" podCreationTimestamp="2025-11-25 10:00:08 +0000 UTC" firstStartedPulling="2025-11-25 10:00:09.739970782 +0000 UTC m=+3342.942465920" lastFinishedPulling="2025-11-25 10:00:12.320705472 +0000 UTC m=+3345.523200609" observedRunningTime="2025-11-25 10:00:12.790289947 +0000 UTC m=+3345.992785075" watchObservedRunningTime="2025-11-25 10:00:12.8092236 +0000 UTC m=+3346.011718729" Nov 25 10:00:18 crc kubenswrapper[4565]: I1125 10:00:18.526150 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rvkhm" Nov 25 10:00:18 crc kubenswrapper[4565]: I1125 10:00:18.527030 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rvkhm" Nov 25 10:00:18 crc kubenswrapper[4565]: I1125 10:00:18.566633 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rvkhm" Nov 25 10:00:18 crc kubenswrapper[4565]: I1125 10:00:18.865998 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rvkhm" Nov 25 10:00:18 crc kubenswrapper[4565]: I1125 10:00:18.921270 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rvkhm"] Nov 25 10:00:20 crc kubenswrapper[4565]: I1125 10:00:20.846092 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rvkhm" 
podUID="17bdae12-18f2-450c-9df4-37ab28cfe0a7" containerName="registry-server" containerID="cri-o://c4ebe84e71742231a52bf248c858f4017eb8c802267ec95ca9ce3ab0e75bde4f" gracePeriod=2 Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.400913 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rvkhm" Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.424684 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wcmc\" (UniqueName: \"kubernetes.io/projected/17bdae12-18f2-450c-9df4-37ab28cfe0a7-kube-api-access-7wcmc\") pod \"17bdae12-18f2-450c-9df4-37ab28cfe0a7\" (UID: \"17bdae12-18f2-450c-9df4-37ab28cfe0a7\") " Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.424870 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17bdae12-18f2-450c-9df4-37ab28cfe0a7-catalog-content\") pod \"17bdae12-18f2-450c-9df4-37ab28cfe0a7\" (UID: \"17bdae12-18f2-450c-9df4-37ab28cfe0a7\") " Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.425053 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17bdae12-18f2-450c-9df4-37ab28cfe0a7-utilities\") pod \"17bdae12-18f2-450c-9df4-37ab28cfe0a7\" (UID: \"17bdae12-18f2-450c-9df4-37ab28cfe0a7\") " Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.425772 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17bdae12-18f2-450c-9df4-37ab28cfe0a7-utilities" (OuterVolumeSpecName: "utilities") pod "17bdae12-18f2-450c-9df4-37ab28cfe0a7" (UID: "17bdae12-18f2-450c-9df4-37ab28cfe0a7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.426226 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17bdae12-18f2-450c-9df4-37ab28cfe0a7-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.436864 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17bdae12-18f2-450c-9df4-37ab28cfe0a7-kube-api-access-7wcmc" (OuterVolumeSpecName: "kube-api-access-7wcmc") pod "17bdae12-18f2-450c-9df4-37ab28cfe0a7" (UID: "17bdae12-18f2-450c-9df4-37ab28cfe0a7"). InnerVolumeSpecName "kube-api-access-7wcmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.484748 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17bdae12-18f2-450c-9df4-37ab28cfe0a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17bdae12-18f2-450c-9df4-37ab28cfe0a7" (UID: "17bdae12-18f2-450c-9df4-37ab28cfe0a7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.529305 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wcmc\" (UniqueName: \"kubernetes.io/projected/17bdae12-18f2-450c-9df4-37ab28cfe0a7-kube-api-access-7wcmc\") on node \"crc\" DevicePath \"\"" Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.529343 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17bdae12-18f2-450c-9df4-37ab28cfe0a7-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.875823 4565 generic.go:334] "Generic (PLEG): container finished" podID="17bdae12-18f2-450c-9df4-37ab28cfe0a7" containerID="c4ebe84e71742231a52bf248c858f4017eb8c802267ec95ca9ce3ab0e75bde4f" exitCode=0 Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.875917 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvkhm" event={"ID":"17bdae12-18f2-450c-9df4-37ab28cfe0a7","Type":"ContainerDied","Data":"c4ebe84e71742231a52bf248c858f4017eb8c802267ec95ca9ce3ab0e75bde4f"} Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.875989 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvkhm" event={"ID":"17bdae12-18f2-450c-9df4-37ab28cfe0a7","Type":"ContainerDied","Data":"1d20306672ba2791c05972969614b10d5a6095e04ef0c3ac4577e9074a731903"} Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.876018 4565 scope.go:117] "RemoveContainer" containerID="c4ebe84e71742231a52bf248c858f4017eb8c802267ec95ca9ce3ab0e75bde4f" Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.876377 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rvkhm" Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.903703 4565 scope.go:117] "RemoveContainer" containerID="4d520f2f4468fbc1bd1b6a4322a272b6ea09d2bea228750c4f70d367719d1a3a" Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.919584 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rvkhm"] Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.929272 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rvkhm"] Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.938241 4565 scope.go:117] "RemoveContainer" containerID="febf0dda8cf7d948ee92c8bf960075a2e725c2918434a60b3d6e2bf8d0b86d8a" Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.965546 4565 scope.go:117] "RemoveContainer" containerID="c4ebe84e71742231a52bf248c858f4017eb8c802267ec95ca9ce3ab0e75bde4f" Nov 25 10:00:21 crc kubenswrapper[4565]: E1125 10:00:21.965891 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4ebe84e71742231a52bf248c858f4017eb8c802267ec95ca9ce3ab0e75bde4f\": container with ID starting with c4ebe84e71742231a52bf248c858f4017eb8c802267ec95ca9ce3ab0e75bde4f not found: ID does not exist" containerID="c4ebe84e71742231a52bf248c858f4017eb8c802267ec95ca9ce3ab0e75bde4f" Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.966018 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4ebe84e71742231a52bf248c858f4017eb8c802267ec95ca9ce3ab0e75bde4f"} err="failed to get container status \"c4ebe84e71742231a52bf248c858f4017eb8c802267ec95ca9ce3ab0e75bde4f\": rpc error: code = NotFound desc = could not find container \"c4ebe84e71742231a52bf248c858f4017eb8c802267ec95ca9ce3ab0e75bde4f\": container with ID starting with c4ebe84e71742231a52bf248c858f4017eb8c802267ec95ca9ce3ab0e75bde4f not 
found: ID does not exist" Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.966094 4565 scope.go:117] "RemoveContainer" containerID="4d520f2f4468fbc1bd1b6a4322a272b6ea09d2bea228750c4f70d367719d1a3a" Nov 25 10:00:21 crc kubenswrapper[4565]: E1125 10:00:21.966384 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d520f2f4468fbc1bd1b6a4322a272b6ea09d2bea228750c4f70d367719d1a3a\": container with ID starting with 4d520f2f4468fbc1bd1b6a4322a272b6ea09d2bea228750c4f70d367719d1a3a not found: ID does not exist" containerID="4d520f2f4468fbc1bd1b6a4322a272b6ea09d2bea228750c4f70d367719d1a3a" Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.966458 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d520f2f4468fbc1bd1b6a4322a272b6ea09d2bea228750c4f70d367719d1a3a"} err="failed to get container status \"4d520f2f4468fbc1bd1b6a4322a272b6ea09d2bea228750c4f70d367719d1a3a\": rpc error: code = NotFound desc = could not find container \"4d520f2f4468fbc1bd1b6a4322a272b6ea09d2bea228750c4f70d367719d1a3a\": container with ID starting with 4d520f2f4468fbc1bd1b6a4322a272b6ea09d2bea228750c4f70d367719d1a3a not found: ID does not exist" Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.966517 4565 scope.go:117] "RemoveContainer" containerID="febf0dda8cf7d948ee92c8bf960075a2e725c2918434a60b3d6e2bf8d0b86d8a" Nov 25 10:00:21 crc kubenswrapper[4565]: E1125 10:00:21.966739 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"febf0dda8cf7d948ee92c8bf960075a2e725c2918434a60b3d6e2bf8d0b86d8a\": container with ID starting with febf0dda8cf7d948ee92c8bf960075a2e725c2918434a60b3d6e2bf8d0b86d8a not found: ID does not exist" containerID="febf0dda8cf7d948ee92c8bf960075a2e725c2918434a60b3d6e2bf8d0b86d8a" Nov 25 10:00:21 crc kubenswrapper[4565]: I1125 10:00:21.966857 4565 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"febf0dda8cf7d948ee92c8bf960075a2e725c2918434a60b3d6e2bf8d0b86d8a"} err="failed to get container status \"febf0dda8cf7d948ee92c8bf960075a2e725c2918434a60b3d6e2bf8d0b86d8a\": rpc error: code = NotFound desc = could not find container \"febf0dda8cf7d948ee92c8bf960075a2e725c2918434a60b3d6e2bf8d0b86d8a\": container with ID starting with febf0dda8cf7d948ee92c8bf960075a2e725c2918434a60b3d6e2bf8d0b86d8a not found: ID does not exist" Nov 25 10:00:23 crc kubenswrapper[4565]: I1125 10:00:23.105969 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17bdae12-18f2-450c-9df4-37ab28cfe0a7" path="/var/lib/kubelet/pods/17bdae12-18f2-450c-9df4-37ab28cfe0a7/volumes" Nov 25 10:00:25 crc kubenswrapper[4565]: I1125 10:00:25.099103 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 10:00:25 crc kubenswrapper[4565]: I1125 10:00:25.099405 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 10:00:37 crc kubenswrapper[4565]: I1125 10:00:37.613292 4565 scope.go:117] "RemoveContainer" containerID="58d9e06d96c969f0322a4ee8258fda4dc74a72519d33495ce748c1c634fe025d" Nov 25 10:00:49 crc kubenswrapper[4565]: I1125 10:00:49.506940 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2728v"] Nov 25 10:00:49 crc kubenswrapper[4565]: E1125 10:00:49.508170 4565 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="17bdae12-18f2-450c-9df4-37ab28cfe0a7" containerName="extract-utilities" Nov 25 10:00:49 crc kubenswrapper[4565]: I1125 10:00:49.508196 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="17bdae12-18f2-450c-9df4-37ab28cfe0a7" containerName="extract-utilities" Nov 25 10:00:49 crc kubenswrapper[4565]: E1125 10:00:49.508214 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17bdae12-18f2-450c-9df4-37ab28cfe0a7" containerName="registry-server" Nov 25 10:00:49 crc kubenswrapper[4565]: I1125 10:00:49.508220 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="17bdae12-18f2-450c-9df4-37ab28cfe0a7" containerName="registry-server" Nov 25 10:00:49 crc kubenswrapper[4565]: E1125 10:00:49.508239 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17bdae12-18f2-450c-9df4-37ab28cfe0a7" containerName="extract-content" Nov 25 10:00:49 crc kubenswrapper[4565]: I1125 10:00:49.508245 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="17bdae12-18f2-450c-9df4-37ab28cfe0a7" containerName="extract-content" Nov 25 10:00:49 crc kubenswrapper[4565]: I1125 10:00:49.508492 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="17bdae12-18f2-450c-9df4-37ab28cfe0a7" containerName="registry-server" Nov 25 10:00:49 crc kubenswrapper[4565]: I1125 10:00:49.510179 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2728v" Nov 25 10:00:49 crc kubenswrapper[4565]: I1125 10:00:49.526647 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2728v"] Nov 25 10:00:49 crc kubenswrapper[4565]: I1125 10:00:49.547084 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlq98\" (UniqueName: \"kubernetes.io/projected/eb7746ef-afde-4835-bdca-c453054d2184-kube-api-access-qlq98\") pod \"redhat-marketplace-2728v\" (UID: \"eb7746ef-afde-4835-bdca-c453054d2184\") " pod="openshift-marketplace/redhat-marketplace-2728v" Nov 25 10:00:49 crc kubenswrapper[4565]: I1125 10:00:49.547260 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb7746ef-afde-4835-bdca-c453054d2184-catalog-content\") pod \"redhat-marketplace-2728v\" (UID: \"eb7746ef-afde-4835-bdca-c453054d2184\") " pod="openshift-marketplace/redhat-marketplace-2728v" Nov 25 10:00:49 crc kubenswrapper[4565]: I1125 10:00:49.547628 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb7746ef-afde-4835-bdca-c453054d2184-utilities\") pod \"redhat-marketplace-2728v\" (UID: \"eb7746ef-afde-4835-bdca-c453054d2184\") " pod="openshift-marketplace/redhat-marketplace-2728v" Nov 25 10:00:49 crc kubenswrapper[4565]: I1125 10:00:49.649177 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb7746ef-afde-4835-bdca-c453054d2184-utilities\") pod \"redhat-marketplace-2728v\" (UID: \"eb7746ef-afde-4835-bdca-c453054d2184\") " pod="openshift-marketplace/redhat-marketplace-2728v" Nov 25 10:00:49 crc kubenswrapper[4565]: I1125 10:00:49.649300 4565 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-qlq98\" (UniqueName: \"kubernetes.io/projected/eb7746ef-afde-4835-bdca-c453054d2184-kube-api-access-qlq98\") pod \"redhat-marketplace-2728v\" (UID: \"eb7746ef-afde-4835-bdca-c453054d2184\") " pod="openshift-marketplace/redhat-marketplace-2728v" Nov 25 10:00:49 crc kubenswrapper[4565]: I1125 10:00:49.649352 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb7746ef-afde-4835-bdca-c453054d2184-catalog-content\") pod \"redhat-marketplace-2728v\" (UID: \"eb7746ef-afde-4835-bdca-c453054d2184\") " pod="openshift-marketplace/redhat-marketplace-2728v" Nov 25 10:00:49 crc kubenswrapper[4565]: I1125 10:00:49.649896 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb7746ef-afde-4835-bdca-c453054d2184-utilities\") pod \"redhat-marketplace-2728v\" (UID: \"eb7746ef-afde-4835-bdca-c453054d2184\") " pod="openshift-marketplace/redhat-marketplace-2728v" Nov 25 10:00:49 crc kubenswrapper[4565]: I1125 10:00:49.649953 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb7746ef-afde-4835-bdca-c453054d2184-catalog-content\") pod \"redhat-marketplace-2728v\" (UID: \"eb7746ef-afde-4835-bdca-c453054d2184\") " pod="openshift-marketplace/redhat-marketplace-2728v" Nov 25 10:00:49 crc kubenswrapper[4565]: I1125 10:00:49.679997 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlq98\" (UniqueName: \"kubernetes.io/projected/eb7746ef-afde-4835-bdca-c453054d2184-kube-api-access-qlq98\") pod \"redhat-marketplace-2728v\" (UID: \"eb7746ef-afde-4835-bdca-c453054d2184\") " pod="openshift-marketplace/redhat-marketplace-2728v" Nov 25 10:00:49 crc kubenswrapper[4565]: I1125 10:00:49.832775 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2728v" Nov 25 10:00:50 crc kubenswrapper[4565]: I1125 10:00:50.310325 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2728v"] Nov 25 10:00:51 crc kubenswrapper[4565]: I1125 10:00:51.147701 4565 generic.go:334] "Generic (PLEG): container finished" podID="eb7746ef-afde-4835-bdca-c453054d2184" containerID="3f5ddbdda1cd7969aa13c34db753b170d923a66d43eb607639cca098dc212d97" exitCode=0 Nov 25 10:00:51 crc kubenswrapper[4565]: I1125 10:00:51.147845 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2728v" event={"ID":"eb7746ef-afde-4835-bdca-c453054d2184","Type":"ContainerDied","Data":"3f5ddbdda1cd7969aa13c34db753b170d923a66d43eb607639cca098dc212d97"} Nov 25 10:00:51 crc kubenswrapper[4565]: I1125 10:00:51.148305 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2728v" event={"ID":"eb7746ef-afde-4835-bdca-c453054d2184","Type":"ContainerStarted","Data":"2635df3f8453202657f1b126f412da3280b1d12c4827d4dafd75b8b658775527"} Nov 25 10:00:53 crc kubenswrapper[4565]: I1125 10:00:53.167485 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2728v" event={"ID":"eb7746ef-afde-4835-bdca-c453054d2184","Type":"ContainerStarted","Data":"20057a755b6f5317caba3a9cf1d62bf1263293eb22e7af6dee78352c90ad5aca"} Nov 25 10:00:54 crc kubenswrapper[4565]: I1125 10:00:54.178128 4565 generic.go:334] "Generic (PLEG): container finished" podID="eb7746ef-afde-4835-bdca-c453054d2184" containerID="20057a755b6f5317caba3a9cf1d62bf1263293eb22e7af6dee78352c90ad5aca" exitCode=0 Nov 25 10:00:54 crc kubenswrapper[4565]: I1125 10:00:54.178200 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2728v" 
event={"ID":"eb7746ef-afde-4835-bdca-c453054d2184","Type":"ContainerDied","Data":"20057a755b6f5317caba3a9cf1d62bf1263293eb22e7af6dee78352c90ad5aca"} Nov 25 10:00:55 crc kubenswrapper[4565]: I1125 10:00:55.099594 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 10:00:55 crc kubenswrapper[4565]: I1125 10:00:55.099993 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 10:00:55 crc kubenswrapper[4565]: I1125 10:00:55.108579 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 10:00:55 crc kubenswrapper[4565]: I1125 10:00:55.109008 4565 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75b70cc61715acf4a6e67100e637f90a320593b52905b15de68119be8561ea55"} pod="openshift-machine-config-operator/machine-config-daemon-r28bt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 10:00:55 crc kubenswrapper[4565]: I1125 10:00:55.109084 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" containerID="cri-o://75b70cc61715acf4a6e67100e637f90a320593b52905b15de68119be8561ea55" gracePeriod=600 Nov 25 10:00:55 crc kubenswrapper[4565]: I1125 10:00:55.192332 4565 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2728v" event={"ID":"eb7746ef-afde-4835-bdca-c453054d2184","Type":"ContainerStarted","Data":"a2dd77d845b5463c6b08e43d379ca4499ce3692db9d13e590afcdd0d456b55a6"} Nov 25 10:00:55 crc kubenswrapper[4565]: I1125 10:00:55.223415 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2728v" podStartSLOduration=2.698155827 podStartE2EDuration="6.223398704s" podCreationTimestamp="2025-11-25 10:00:49 +0000 UTC" firstStartedPulling="2025-11-25 10:00:51.149105917 +0000 UTC m=+3384.351601055" lastFinishedPulling="2025-11-25 10:00:54.674348794 +0000 UTC m=+3387.876843932" observedRunningTime="2025-11-25 10:00:55.209696814 +0000 UTC m=+3388.412191953" watchObservedRunningTime="2025-11-25 10:00:55.223398704 +0000 UTC m=+3388.425893842" Nov 25 10:00:56 crc kubenswrapper[4565]: I1125 10:00:56.206325 4565 generic.go:334] "Generic (PLEG): container finished" podID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerID="75b70cc61715acf4a6e67100e637f90a320593b52905b15de68119be8561ea55" exitCode=0 Nov 25 10:00:56 crc kubenswrapper[4565]: I1125 10:00:56.206378 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerDied","Data":"75b70cc61715acf4a6e67100e637f90a320593b52905b15de68119be8561ea55"} Nov 25 10:00:56 crc kubenswrapper[4565]: I1125 10:00:56.207138 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerStarted","Data":"49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c"} Nov 25 10:00:56 crc kubenswrapper[4565]: I1125 10:00:56.207160 4565 scope.go:117] "RemoveContainer" containerID="a8b4ac6102386f2198b1b86c48f884a65d45746deb4cbe23b51a0dec27843cca" 
Nov 25 10:00:59 crc kubenswrapper[4565]: I1125 10:00:59.833202 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2728v" Nov 25 10:00:59 crc kubenswrapper[4565]: I1125 10:00:59.833791 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2728v" Nov 25 10:00:59 crc kubenswrapper[4565]: I1125 10:00:59.879250 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2728v" Nov 25 10:01:00 crc kubenswrapper[4565]: I1125 10:01:00.147805 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29401081-lh62g"] Nov 25 10:01:00 crc kubenswrapper[4565]: I1125 10:01:00.149503 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29401081-lh62g" Nov 25 10:01:00 crc kubenswrapper[4565]: I1125 10:01:00.166271 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29401081-lh62g"] Nov 25 10:01:00 crc kubenswrapper[4565]: I1125 10:01:00.287305 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2728v" Nov 25 10:01:00 crc kubenswrapper[4565]: I1125 10:01:00.318075 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3749c5c9-d117-42da-bd30-bb593c5d1fb2-config-data\") pod \"keystone-cron-29401081-lh62g\" (UID: \"3749c5c9-d117-42da-bd30-bb593c5d1fb2\") " pod="openstack/keystone-cron-29401081-lh62g" Nov 25 10:01:00 crc kubenswrapper[4565]: I1125 10:01:00.318283 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgzf9\" (UniqueName: \"kubernetes.io/projected/3749c5c9-d117-42da-bd30-bb593c5d1fb2-kube-api-access-hgzf9\") pod \"keystone-cron-29401081-lh62g\" (UID: 
\"3749c5c9-d117-42da-bd30-bb593c5d1fb2\") " pod="openstack/keystone-cron-29401081-lh62g" Nov 25 10:01:00 crc kubenswrapper[4565]: I1125 10:01:00.318707 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3749c5c9-d117-42da-bd30-bb593c5d1fb2-combined-ca-bundle\") pod \"keystone-cron-29401081-lh62g\" (UID: \"3749c5c9-d117-42da-bd30-bb593c5d1fb2\") " pod="openstack/keystone-cron-29401081-lh62g" Nov 25 10:01:00 crc kubenswrapper[4565]: I1125 10:01:00.318881 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3749c5c9-d117-42da-bd30-bb593c5d1fb2-fernet-keys\") pod \"keystone-cron-29401081-lh62g\" (UID: \"3749c5c9-d117-42da-bd30-bb593c5d1fb2\") " pod="openstack/keystone-cron-29401081-lh62g" Nov 25 10:01:00 crc kubenswrapper[4565]: I1125 10:01:00.335319 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2728v"] Nov 25 10:01:00 crc kubenswrapper[4565]: I1125 10:01:00.421606 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3749c5c9-d117-42da-bd30-bb593c5d1fb2-config-data\") pod \"keystone-cron-29401081-lh62g\" (UID: \"3749c5c9-d117-42da-bd30-bb593c5d1fb2\") " pod="openstack/keystone-cron-29401081-lh62g" Nov 25 10:01:00 crc kubenswrapper[4565]: I1125 10:01:00.421712 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgzf9\" (UniqueName: \"kubernetes.io/projected/3749c5c9-d117-42da-bd30-bb593c5d1fb2-kube-api-access-hgzf9\") pod \"keystone-cron-29401081-lh62g\" (UID: \"3749c5c9-d117-42da-bd30-bb593c5d1fb2\") " pod="openstack/keystone-cron-29401081-lh62g" Nov 25 10:01:00 crc kubenswrapper[4565]: I1125 10:01:00.421838 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3749c5c9-d117-42da-bd30-bb593c5d1fb2-combined-ca-bundle\") pod \"keystone-cron-29401081-lh62g\" (UID: \"3749c5c9-d117-42da-bd30-bb593c5d1fb2\") " pod="openstack/keystone-cron-29401081-lh62g" Nov 25 10:01:00 crc kubenswrapper[4565]: I1125 10:01:00.421872 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3749c5c9-d117-42da-bd30-bb593c5d1fb2-fernet-keys\") pod \"keystone-cron-29401081-lh62g\" (UID: \"3749c5c9-d117-42da-bd30-bb593c5d1fb2\") " pod="openstack/keystone-cron-29401081-lh62g" Nov 25 10:01:00 crc kubenswrapper[4565]: I1125 10:01:00.428655 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3749c5c9-d117-42da-bd30-bb593c5d1fb2-fernet-keys\") pod \"keystone-cron-29401081-lh62g\" (UID: \"3749c5c9-d117-42da-bd30-bb593c5d1fb2\") " pod="openstack/keystone-cron-29401081-lh62g" Nov 25 10:01:00 crc kubenswrapper[4565]: I1125 10:01:00.431149 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3749c5c9-d117-42da-bd30-bb593c5d1fb2-combined-ca-bundle\") pod \"keystone-cron-29401081-lh62g\" (UID: \"3749c5c9-d117-42da-bd30-bb593c5d1fb2\") " pod="openstack/keystone-cron-29401081-lh62g" Nov 25 10:01:00 crc kubenswrapper[4565]: I1125 10:01:00.434724 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3749c5c9-d117-42da-bd30-bb593c5d1fb2-config-data\") pod \"keystone-cron-29401081-lh62g\" (UID: \"3749c5c9-d117-42da-bd30-bb593c5d1fb2\") " pod="openstack/keystone-cron-29401081-lh62g" Nov 25 10:01:00 crc kubenswrapper[4565]: I1125 10:01:00.438353 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgzf9\" (UniqueName: 
\"kubernetes.io/projected/3749c5c9-d117-42da-bd30-bb593c5d1fb2-kube-api-access-hgzf9\") pod \"keystone-cron-29401081-lh62g\" (UID: \"3749c5c9-d117-42da-bd30-bb593c5d1fb2\") " pod="openstack/keystone-cron-29401081-lh62g" Nov 25 10:01:00 crc kubenswrapper[4565]: I1125 10:01:00.475495 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29401081-lh62g" Nov 25 10:01:01 crc kubenswrapper[4565]: W1125 10:01:01.067628 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3749c5c9_d117_42da_bd30_bb593c5d1fb2.slice/crio-54d770048cfd8740bcd54513d407bebe411a5f36e8fe6a28e63bb86c420640f5 WatchSource:0}: Error finding container 54d770048cfd8740bcd54513d407bebe411a5f36e8fe6a28e63bb86c420640f5: Status 404 returned error can't find the container with id 54d770048cfd8740bcd54513d407bebe411a5f36e8fe6a28e63bb86c420640f5 Nov 25 10:01:01 crc kubenswrapper[4565]: I1125 10:01:01.071577 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29401081-lh62g"] Nov 25 10:01:01 crc kubenswrapper[4565]: I1125 10:01:01.265059 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29401081-lh62g" event={"ID":"3749c5c9-d117-42da-bd30-bb593c5d1fb2","Type":"ContainerStarted","Data":"b9152e463c41fc527d0e5e278198cdfdeef9efc6dabb7ccf4a23e95b4981bddd"} Nov 25 10:01:01 crc kubenswrapper[4565]: I1125 10:01:01.265544 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29401081-lh62g" event={"ID":"3749c5c9-d117-42da-bd30-bb593c5d1fb2","Type":"ContainerStarted","Data":"54d770048cfd8740bcd54513d407bebe411a5f36e8fe6a28e63bb86c420640f5"} Nov 25 10:01:01 crc kubenswrapper[4565]: I1125 10:01:01.289002 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29401081-lh62g" podStartSLOduration=1.288973735 podStartE2EDuration="1.288973735s" 
podCreationTimestamp="2025-11-25 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 10:01:01.286126558 +0000 UTC m=+3394.488621716" watchObservedRunningTime="2025-11-25 10:01:01.288973735 +0000 UTC m=+3394.491468874" Nov 25 10:01:02 crc kubenswrapper[4565]: I1125 10:01:02.273238 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2728v" podUID="eb7746ef-afde-4835-bdca-c453054d2184" containerName="registry-server" containerID="cri-o://a2dd77d845b5463c6b08e43d379ca4499ce3692db9d13e590afcdd0d456b55a6" gracePeriod=2 Nov 25 10:01:02 crc kubenswrapper[4565]: I1125 10:01:02.828412 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2728v" Nov 25 10:01:03 crc kubenswrapper[4565]: I1125 10:01:03.000609 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlq98\" (UniqueName: \"kubernetes.io/projected/eb7746ef-afde-4835-bdca-c453054d2184-kube-api-access-qlq98\") pod \"eb7746ef-afde-4835-bdca-c453054d2184\" (UID: \"eb7746ef-afde-4835-bdca-c453054d2184\") " Nov 25 10:01:03 crc kubenswrapper[4565]: I1125 10:01:03.000892 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb7746ef-afde-4835-bdca-c453054d2184-catalog-content\") pod \"eb7746ef-afde-4835-bdca-c453054d2184\" (UID: \"eb7746ef-afde-4835-bdca-c453054d2184\") " Nov 25 10:01:03 crc kubenswrapper[4565]: I1125 10:01:03.001049 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb7746ef-afde-4835-bdca-c453054d2184-utilities\") pod \"eb7746ef-afde-4835-bdca-c453054d2184\" (UID: \"eb7746ef-afde-4835-bdca-c453054d2184\") " Nov 25 10:01:03 crc kubenswrapper[4565]: 
I1125 10:01:03.001730 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb7746ef-afde-4835-bdca-c453054d2184-utilities" (OuterVolumeSpecName: "utilities") pod "eb7746ef-afde-4835-bdca-c453054d2184" (UID: "eb7746ef-afde-4835-bdca-c453054d2184"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 10:01:03 crc kubenswrapper[4565]: I1125 10:01:03.006255 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb7746ef-afde-4835-bdca-c453054d2184-kube-api-access-qlq98" (OuterVolumeSpecName: "kube-api-access-qlq98") pod "eb7746ef-afde-4835-bdca-c453054d2184" (UID: "eb7746ef-afde-4835-bdca-c453054d2184"). InnerVolumeSpecName "kube-api-access-qlq98". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 10:01:03 crc kubenswrapper[4565]: I1125 10:01:03.030416 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb7746ef-afde-4835-bdca-c453054d2184-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb7746ef-afde-4835-bdca-c453054d2184" (UID: "eb7746ef-afde-4835-bdca-c453054d2184"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 10:01:03 crc kubenswrapper[4565]: I1125 10:01:03.104405 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlq98\" (UniqueName: \"kubernetes.io/projected/eb7746ef-afde-4835-bdca-c453054d2184-kube-api-access-qlq98\") on node \"crc\" DevicePath \"\"" Nov 25 10:01:03 crc kubenswrapper[4565]: I1125 10:01:03.104546 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb7746ef-afde-4835-bdca-c453054d2184-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 10:01:03 crc kubenswrapper[4565]: I1125 10:01:03.104615 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb7746ef-afde-4835-bdca-c453054d2184-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 10:01:03 crc kubenswrapper[4565]: I1125 10:01:03.287206 4565 generic.go:334] "Generic (PLEG): container finished" podID="eb7746ef-afde-4835-bdca-c453054d2184" containerID="a2dd77d845b5463c6b08e43d379ca4499ce3692db9d13e590afcdd0d456b55a6" exitCode=0 Nov 25 10:01:03 crc kubenswrapper[4565]: I1125 10:01:03.287266 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2728v" event={"ID":"eb7746ef-afde-4835-bdca-c453054d2184","Type":"ContainerDied","Data":"a2dd77d845b5463c6b08e43d379ca4499ce3692db9d13e590afcdd0d456b55a6"} Nov 25 10:01:03 crc kubenswrapper[4565]: I1125 10:01:03.287290 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2728v" Nov 25 10:01:03 crc kubenswrapper[4565]: I1125 10:01:03.287318 4565 scope.go:117] "RemoveContainer" containerID="a2dd77d845b5463c6b08e43d379ca4499ce3692db9d13e590afcdd0d456b55a6" Nov 25 10:01:03 crc kubenswrapper[4565]: I1125 10:01:03.287303 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2728v" event={"ID":"eb7746ef-afde-4835-bdca-c453054d2184","Type":"ContainerDied","Data":"2635df3f8453202657f1b126f412da3280b1d12c4827d4dafd75b8b658775527"} Nov 25 10:01:03 crc kubenswrapper[4565]: I1125 10:01:03.316799 4565 scope.go:117] "RemoveContainer" containerID="20057a755b6f5317caba3a9cf1d62bf1263293eb22e7af6dee78352c90ad5aca" Nov 25 10:01:03 crc kubenswrapper[4565]: I1125 10:01:03.319093 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2728v"] Nov 25 10:01:03 crc kubenswrapper[4565]: I1125 10:01:03.339847 4565 scope.go:117] "RemoveContainer" containerID="3f5ddbdda1cd7969aa13c34db753b170d923a66d43eb607639cca098dc212d97" Nov 25 10:01:03 crc kubenswrapper[4565]: I1125 10:01:03.341887 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2728v"] Nov 25 10:01:03 crc kubenswrapper[4565]: I1125 10:01:03.369891 4565 scope.go:117] "RemoveContainer" containerID="a2dd77d845b5463c6b08e43d379ca4499ce3692db9d13e590afcdd0d456b55a6" Nov 25 10:01:03 crc kubenswrapper[4565]: E1125 10:01:03.370327 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2dd77d845b5463c6b08e43d379ca4499ce3692db9d13e590afcdd0d456b55a6\": container with ID starting with a2dd77d845b5463c6b08e43d379ca4499ce3692db9d13e590afcdd0d456b55a6 not found: ID does not exist" containerID="a2dd77d845b5463c6b08e43d379ca4499ce3692db9d13e590afcdd0d456b55a6" Nov 25 10:01:03 crc kubenswrapper[4565]: I1125 10:01:03.370458 4565 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2dd77d845b5463c6b08e43d379ca4499ce3692db9d13e590afcdd0d456b55a6"} err="failed to get container status \"a2dd77d845b5463c6b08e43d379ca4499ce3692db9d13e590afcdd0d456b55a6\": rpc error: code = NotFound desc = could not find container \"a2dd77d845b5463c6b08e43d379ca4499ce3692db9d13e590afcdd0d456b55a6\": container with ID starting with a2dd77d845b5463c6b08e43d379ca4499ce3692db9d13e590afcdd0d456b55a6 not found: ID does not exist" Nov 25 10:01:03 crc kubenswrapper[4565]: I1125 10:01:03.370540 4565 scope.go:117] "RemoveContainer" containerID="20057a755b6f5317caba3a9cf1d62bf1263293eb22e7af6dee78352c90ad5aca" Nov 25 10:01:03 crc kubenswrapper[4565]: E1125 10:01:03.370898 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20057a755b6f5317caba3a9cf1d62bf1263293eb22e7af6dee78352c90ad5aca\": container with ID starting with 20057a755b6f5317caba3a9cf1d62bf1263293eb22e7af6dee78352c90ad5aca not found: ID does not exist" containerID="20057a755b6f5317caba3a9cf1d62bf1263293eb22e7af6dee78352c90ad5aca" Nov 25 10:01:03 crc kubenswrapper[4565]: I1125 10:01:03.370944 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20057a755b6f5317caba3a9cf1d62bf1263293eb22e7af6dee78352c90ad5aca"} err="failed to get container status \"20057a755b6f5317caba3a9cf1d62bf1263293eb22e7af6dee78352c90ad5aca\": rpc error: code = NotFound desc = could not find container \"20057a755b6f5317caba3a9cf1d62bf1263293eb22e7af6dee78352c90ad5aca\": container with ID starting with 20057a755b6f5317caba3a9cf1d62bf1263293eb22e7af6dee78352c90ad5aca not found: ID does not exist" Nov 25 10:01:03 crc kubenswrapper[4565]: I1125 10:01:03.370966 4565 scope.go:117] "RemoveContainer" containerID="3f5ddbdda1cd7969aa13c34db753b170d923a66d43eb607639cca098dc212d97" Nov 25 10:01:03 crc kubenswrapper[4565]: E1125 
10:01:03.371175 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f5ddbdda1cd7969aa13c34db753b170d923a66d43eb607639cca098dc212d97\": container with ID starting with 3f5ddbdda1cd7969aa13c34db753b170d923a66d43eb607639cca098dc212d97 not found: ID does not exist" containerID="3f5ddbdda1cd7969aa13c34db753b170d923a66d43eb607639cca098dc212d97" Nov 25 10:01:03 crc kubenswrapper[4565]: I1125 10:01:03.371206 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f5ddbdda1cd7969aa13c34db753b170d923a66d43eb607639cca098dc212d97"} err="failed to get container status \"3f5ddbdda1cd7969aa13c34db753b170d923a66d43eb607639cca098dc212d97\": rpc error: code = NotFound desc = could not find container \"3f5ddbdda1cd7969aa13c34db753b170d923a66d43eb607639cca098dc212d97\": container with ID starting with 3f5ddbdda1cd7969aa13c34db753b170d923a66d43eb607639cca098dc212d97 not found: ID does not exist" Nov 25 10:01:04 crc kubenswrapper[4565]: I1125 10:01:04.303610 4565 generic.go:334] "Generic (PLEG): container finished" podID="3749c5c9-d117-42da-bd30-bb593c5d1fb2" containerID="b9152e463c41fc527d0e5e278198cdfdeef9efc6dabb7ccf4a23e95b4981bddd" exitCode=0 Nov 25 10:01:04 crc kubenswrapper[4565]: I1125 10:01:04.303673 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29401081-lh62g" event={"ID":"3749c5c9-d117-42da-bd30-bb593c5d1fb2","Type":"ContainerDied","Data":"b9152e463c41fc527d0e5e278198cdfdeef9efc6dabb7ccf4a23e95b4981bddd"} Nov 25 10:01:05 crc kubenswrapper[4565]: I1125 10:01:05.141813 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb7746ef-afde-4835-bdca-c453054d2184" path="/var/lib/kubelet/pods/eb7746ef-afde-4835-bdca-c453054d2184/volumes" Nov 25 10:01:05 crc kubenswrapper[4565]: I1125 10:01:05.728263 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29401081-lh62g" Nov 25 10:01:05 crc kubenswrapper[4565]: I1125 10:01:05.885669 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgzf9\" (UniqueName: \"kubernetes.io/projected/3749c5c9-d117-42da-bd30-bb593c5d1fb2-kube-api-access-hgzf9\") pod \"3749c5c9-d117-42da-bd30-bb593c5d1fb2\" (UID: \"3749c5c9-d117-42da-bd30-bb593c5d1fb2\") " Nov 25 10:01:05 crc kubenswrapper[4565]: I1125 10:01:05.886427 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3749c5c9-d117-42da-bd30-bb593c5d1fb2-fernet-keys\") pod \"3749c5c9-d117-42da-bd30-bb593c5d1fb2\" (UID: \"3749c5c9-d117-42da-bd30-bb593c5d1fb2\") " Nov 25 10:01:05 crc kubenswrapper[4565]: I1125 10:01:05.886549 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3749c5c9-d117-42da-bd30-bb593c5d1fb2-config-data\") pod \"3749c5c9-d117-42da-bd30-bb593c5d1fb2\" (UID: \"3749c5c9-d117-42da-bd30-bb593c5d1fb2\") " Nov 25 10:01:05 crc kubenswrapper[4565]: I1125 10:01:05.886793 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3749c5c9-d117-42da-bd30-bb593c5d1fb2-combined-ca-bundle\") pod \"3749c5c9-d117-42da-bd30-bb593c5d1fb2\" (UID: \"3749c5c9-d117-42da-bd30-bb593c5d1fb2\") " Nov 25 10:01:05 crc kubenswrapper[4565]: I1125 10:01:05.907653 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3749c5c9-d117-42da-bd30-bb593c5d1fb2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3749c5c9-d117-42da-bd30-bb593c5d1fb2" (UID: "3749c5c9-d117-42da-bd30-bb593c5d1fb2"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 10:01:05 crc kubenswrapper[4565]: I1125 10:01:05.909997 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3749c5c9-d117-42da-bd30-bb593c5d1fb2-kube-api-access-hgzf9" (OuterVolumeSpecName: "kube-api-access-hgzf9") pod "3749c5c9-d117-42da-bd30-bb593c5d1fb2" (UID: "3749c5c9-d117-42da-bd30-bb593c5d1fb2"). InnerVolumeSpecName "kube-api-access-hgzf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 10:01:05 crc kubenswrapper[4565]: I1125 10:01:05.913310 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3749c5c9-d117-42da-bd30-bb593c5d1fb2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3749c5c9-d117-42da-bd30-bb593c5d1fb2" (UID: "3749c5c9-d117-42da-bd30-bb593c5d1fb2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 10:01:05 crc kubenswrapper[4565]: I1125 10:01:05.946441 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3749c5c9-d117-42da-bd30-bb593c5d1fb2-config-data" (OuterVolumeSpecName: "config-data") pod "3749c5c9-d117-42da-bd30-bb593c5d1fb2" (UID: "3749c5c9-d117-42da-bd30-bb593c5d1fb2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 10:01:05 crc kubenswrapper[4565]: I1125 10:01:05.992100 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgzf9\" (UniqueName: \"kubernetes.io/projected/3749c5c9-d117-42da-bd30-bb593c5d1fb2-kube-api-access-hgzf9\") on node \"crc\" DevicePath \"\"" Nov 25 10:01:05 crc kubenswrapper[4565]: I1125 10:01:05.992139 4565 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3749c5c9-d117-42da-bd30-bb593c5d1fb2-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 25 10:01:05 crc kubenswrapper[4565]: I1125 10:01:05.992154 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3749c5c9-d117-42da-bd30-bb593c5d1fb2-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 10:01:05 crc kubenswrapper[4565]: I1125 10:01:05.992164 4565 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3749c5c9-d117-42da-bd30-bb593c5d1fb2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 10:01:06 crc kubenswrapper[4565]: I1125 10:01:06.324077 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29401081-lh62g" event={"ID":"3749c5c9-d117-42da-bd30-bb593c5d1fb2","Type":"ContainerDied","Data":"54d770048cfd8740bcd54513d407bebe411a5f36e8fe6a28e63bb86c420640f5"} Nov 25 10:01:06 crc kubenswrapper[4565]: I1125 10:01:06.324598 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54d770048cfd8740bcd54513d407bebe411a5f36e8fe6a28e63bb86c420640f5" Nov 25 10:01:06 crc kubenswrapper[4565]: I1125 10:01:06.324153 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29401081-lh62g" Nov 25 10:01:46 crc kubenswrapper[4565]: I1125 10:01:46.966884 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-46drf"] Nov 25 10:01:46 crc kubenswrapper[4565]: E1125 10:01:46.968085 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3749c5c9-d117-42da-bd30-bb593c5d1fb2" containerName="keystone-cron" Nov 25 10:01:46 crc kubenswrapper[4565]: I1125 10:01:46.968102 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="3749c5c9-d117-42da-bd30-bb593c5d1fb2" containerName="keystone-cron" Nov 25 10:01:46 crc kubenswrapper[4565]: E1125 10:01:46.968134 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb7746ef-afde-4835-bdca-c453054d2184" containerName="extract-utilities" Nov 25 10:01:46 crc kubenswrapper[4565]: I1125 10:01:46.968140 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb7746ef-afde-4835-bdca-c453054d2184" containerName="extract-utilities" Nov 25 10:01:46 crc kubenswrapper[4565]: E1125 10:01:46.968172 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb7746ef-afde-4835-bdca-c453054d2184" containerName="registry-server" Nov 25 10:01:46 crc kubenswrapper[4565]: I1125 10:01:46.968178 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb7746ef-afde-4835-bdca-c453054d2184" containerName="registry-server" Nov 25 10:01:46 crc kubenswrapper[4565]: E1125 10:01:46.968199 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb7746ef-afde-4835-bdca-c453054d2184" containerName="extract-content" Nov 25 10:01:46 crc kubenswrapper[4565]: I1125 10:01:46.968205 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb7746ef-afde-4835-bdca-c453054d2184" containerName="extract-content" Nov 25 10:01:46 crc kubenswrapper[4565]: I1125 10:01:46.968477 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="3749c5c9-d117-42da-bd30-bb593c5d1fb2" 
containerName="keystone-cron" Nov 25 10:01:46 crc kubenswrapper[4565]: I1125 10:01:46.968489 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb7746ef-afde-4835-bdca-c453054d2184" containerName="registry-server" Nov 25 10:01:46 crc kubenswrapper[4565]: I1125 10:01:46.970240 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-46drf" Nov 25 10:01:46 crc kubenswrapper[4565]: I1125 10:01:46.991261 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-46drf"] Nov 25 10:01:46 crc kubenswrapper[4565]: I1125 10:01:46.995483 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/501873a7-8de6-4677-840d-ea75b1630d45-catalog-content\") pod \"redhat-operators-46drf\" (UID: \"501873a7-8de6-4677-840d-ea75b1630d45\") " pod="openshift-marketplace/redhat-operators-46drf" Nov 25 10:01:46 crc kubenswrapper[4565]: I1125 10:01:46.995639 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/501873a7-8de6-4677-840d-ea75b1630d45-utilities\") pod \"redhat-operators-46drf\" (UID: \"501873a7-8de6-4677-840d-ea75b1630d45\") " pod="openshift-marketplace/redhat-operators-46drf" Nov 25 10:01:46 crc kubenswrapper[4565]: I1125 10:01:46.995728 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gbvq\" (UniqueName: \"kubernetes.io/projected/501873a7-8de6-4677-840d-ea75b1630d45-kube-api-access-5gbvq\") pod \"redhat-operators-46drf\" (UID: \"501873a7-8de6-4677-840d-ea75b1630d45\") " pod="openshift-marketplace/redhat-operators-46drf" Nov 25 10:01:47 crc kubenswrapper[4565]: I1125 10:01:47.098163 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/501873a7-8de6-4677-840d-ea75b1630d45-catalog-content\") pod \"redhat-operators-46drf\" (UID: \"501873a7-8de6-4677-840d-ea75b1630d45\") " pod="openshift-marketplace/redhat-operators-46drf" Nov 25 10:01:47 crc kubenswrapper[4565]: I1125 10:01:47.098220 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/501873a7-8de6-4677-840d-ea75b1630d45-utilities\") pod \"redhat-operators-46drf\" (UID: \"501873a7-8de6-4677-840d-ea75b1630d45\") " pod="openshift-marketplace/redhat-operators-46drf" Nov 25 10:01:47 crc kubenswrapper[4565]: I1125 10:01:47.098260 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gbvq\" (UniqueName: \"kubernetes.io/projected/501873a7-8de6-4677-840d-ea75b1630d45-kube-api-access-5gbvq\") pod \"redhat-operators-46drf\" (UID: \"501873a7-8de6-4677-840d-ea75b1630d45\") " pod="openshift-marketplace/redhat-operators-46drf" Nov 25 10:01:47 crc kubenswrapper[4565]: I1125 10:01:47.098732 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/501873a7-8de6-4677-840d-ea75b1630d45-utilities\") pod \"redhat-operators-46drf\" (UID: \"501873a7-8de6-4677-840d-ea75b1630d45\") " pod="openshift-marketplace/redhat-operators-46drf" Nov 25 10:01:47 crc kubenswrapper[4565]: I1125 10:01:47.098730 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/501873a7-8de6-4677-840d-ea75b1630d45-catalog-content\") pod \"redhat-operators-46drf\" (UID: \"501873a7-8de6-4677-840d-ea75b1630d45\") " pod="openshift-marketplace/redhat-operators-46drf" Nov 25 10:01:47 crc kubenswrapper[4565]: I1125 10:01:47.117671 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gbvq\" (UniqueName: 
\"kubernetes.io/projected/501873a7-8de6-4677-840d-ea75b1630d45-kube-api-access-5gbvq\") pod \"redhat-operators-46drf\" (UID: \"501873a7-8de6-4677-840d-ea75b1630d45\") " pod="openshift-marketplace/redhat-operators-46drf" Nov 25 10:01:47 crc kubenswrapper[4565]: I1125 10:01:47.292800 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-46drf" Nov 25 10:01:47 crc kubenswrapper[4565]: I1125 10:01:47.737490 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-46drf"] Nov 25 10:01:47 crc kubenswrapper[4565]: W1125 10:01:47.758447 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod501873a7_8de6_4677_840d_ea75b1630d45.slice/crio-4594a44fcd9b7f86a30e6eaa6cf50a4ac89103a3ac9bd94914313fc961d58299 WatchSource:0}: Error finding container 4594a44fcd9b7f86a30e6eaa6cf50a4ac89103a3ac9bd94914313fc961d58299: Status 404 returned error can't find the container with id 4594a44fcd9b7f86a30e6eaa6cf50a4ac89103a3ac9bd94914313fc961d58299 Nov 25 10:01:48 crc kubenswrapper[4565]: I1125 10:01:48.714409 4565 generic.go:334] "Generic (PLEG): container finished" podID="501873a7-8de6-4677-840d-ea75b1630d45" containerID="a4b7c1d0b1d88b9930bc9149fdf71d11ee1f7ed4b7719d6c866d0c6772990706" exitCode=0 Nov 25 10:01:48 crc kubenswrapper[4565]: I1125 10:01:48.714504 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46drf" event={"ID":"501873a7-8de6-4677-840d-ea75b1630d45","Type":"ContainerDied","Data":"a4b7c1d0b1d88b9930bc9149fdf71d11ee1f7ed4b7719d6c866d0c6772990706"} Nov 25 10:01:48 crc kubenswrapper[4565]: I1125 10:01:48.715278 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46drf" 
event={"ID":"501873a7-8de6-4677-840d-ea75b1630d45","Type":"ContainerStarted","Data":"4594a44fcd9b7f86a30e6eaa6cf50a4ac89103a3ac9bd94914313fc961d58299"} Nov 25 10:01:49 crc kubenswrapper[4565]: I1125 10:01:49.726789 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46drf" event={"ID":"501873a7-8de6-4677-840d-ea75b1630d45","Type":"ContainerStarted","Data":"eafe1e3b52256e4e263f8bb3c49dd8ddf6e70cc03fd649f35a71cb39beeac6b3"} Nov 25 10:01:51 crc kubenswrapper[4565]: I1125 10:01:51.750021 4565 generic.go:334] "Generic (PLEG): container finished" podID="501873a7-8de6-4677-840d-ea75b1630d45" containerID="eafe1e3b52256e4e263f8bb3c49dd8ddf6e70cc03fd649f35a71cb39beeac6b3" exitCode=0 Nov 25 10:01:51 crc kubenswrapper[4565]: I1125 10:01:51.750117 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46drf" event={"ID":"501873a7-8de6-4677-840d-ea75b1630d45","Type":"ContainerDied","Data":"eafe1e3b52256e4e263f8bb3c49dd8ddf6e70cc03fd649f35a71cb39beeac6b3"} Nov 25 10:01:52 crc kubenswrapper[4565]: I1125 10:01:52.764104 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46drf" event={"ID":"501873a7-8de6-4677-840d-ea75b1630d45","Type":"ContainerStarted","Data":"46a629f45f1904088c12ae679bd4853f7cd1f1f2e3d04dd619f5d6ac1fff27e5"} Nov 25 10:01:52 crc kubenswrapper[4565]: I1125 10:01:52.789708 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-46drf" podStartSLOduration=3.194023294 podStartE2EDuration="6.789668166s" podCreationTimestamp="2025-11-25 10:01:46 +0000 UTC" firstStartedPulling="2025-11-25 10:01:48.717244806 +0000 UTC m=+3441.919739944" lastFinishedPulling="2025-11-25 10:01:52.312889678 +0000 UTC m=+3445.515384816" observedRunningTime="2025-11-25 10:01:52.782546591 +0000 UTC m=+3445.985041730" watchObservedRunningTime="2025-11-25 10:01:52.789668166 +0000 UTC m=+3445.992163304" 
Nov 25 10:01:57 crc kubenswrapper[4565]: I1125 10:01:57.293545 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-46drf" Nov 25 10:01:57 crc kubenswrapper[4565]: I1125 10:01:57.294116 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-46drf" Nov 25 10:01:58 crc kubenswrapper[4565]: I1125 10:01:58.383507 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-46drf" podUID="501873a7-8de6-4677-840d-ea75b1630d45" containerName="registry-server" probeResult="failure" output=< Nov 25 10:01:58 crc kubenswrapper[4565]: timeout: failed to connect service ":50051" within 1s Nov 25 10:01:58 crc kubenswrapper[4565]: > Nov 25 10:02:07 crc kubenswrapper[4565]: I1125 10:02:07.332352 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-46drf" Nov 25 10:02:07 crc kubenswrapper[4565]: I1125 10:02:07.401203 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-46drf" Nov 25 10:02:07 crc kubenswrapper[4565]: I1125 10:02:07.573498 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-46drf"] Nov 25 10:02:08 crc kubenswrapper[4565]: I1125 10:02:08.918289 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-46drf" podUID="501873a7-8de6-4677-840d-ea75b1630d45" containerName="registry-server" containerID="cri-o://46a629f45f1904088c12ae679bd4853f7cd1f1f2e3d04dd619f5d6ac1fff27e5" gracePeriod=2 Nov 25 10:02:09 crc kubenswrapper[4565]: I1125 10:02:09.411222 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-46drf" Nov 25 10:02:09 crc kubenswrapper[4565]: I1125 10:02:09.520229 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/501873a7-8de6-4677-840d-ea75b1630d45-utilities\") pod \"501873a7-8de6-4677-840d-ea75b1630d45\" (UID: \"501873a7-8de6-4677-840d-ea75b1630d45\") " Nov 25 10:02:09 crc kubenswrapper[4565]: I1125 10:02:09.520304 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/501873a7-8de6-4677-840d-ea75b1630d45-catalog-content\") pod \"501873a7-8de6-4677-840d-ea75b1630d45\" (UID: \"501873a7-8de6-4677-840d-ea75b1630d45\") " Nov 25 10:02:09 crc kubenswrapper[4565]: I1125 10:02:09.520555 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gbvq\" (UniqueName: \"kubernetes.io/projected/501873a7-8de6-4677-840d-ea75b1630d45-kube-api-access-5gbvq\") pod \"501873a7-8de6-4677-840d-ea75b1630d45\" (UID: \"501873a7-8de6-4677-840d-ea75b1630d45\") " Nov 25 10:02:09 crc kubenswrapper[4565]: I1125 10:02:09.521141 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/501873a7-8de6-4677-840d-ea75b1630d45-utilities" (OuterVolumeSpecName: "utilities") pod "501873a7-8de6-4677-840d-ea75b1630d45" (UID: "501873a7-8de6-4677-840d-ea75b1630d45"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 10:02:09 crc kubenswrapper[4565]: I1125 10:02:09.527804 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/501873a7-8de6-4677-840d-ea75b1630d45-kube-api-access-5gbvq" (OuterVolumeSpecName: "kube-api-access-5gbvq") pod "501873a7-8de6-4677-840d-ea75b1630d45" (UID: "501873a7-8de6-4677-840d-ea75b1630d45"). InnerVolumeSpecName "kube-api-access-5gbvq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 10:02:09 crc kubenswrapper[4565]: I1125 10:02:09.600296 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/501873a7-8de6-4677-840d-ea75b1630d45-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "501873a7-8de6-4677-840d-ea75b1630d45" (UID: "501873a7-8de6-4677-840d-ea75b1630d45"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 10:02:09 crc kubenswrapper[4565]: I1125 10:02:09.623040 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gbvq\" (UniqueName: \"kubernetes.io/projected/501873a7-8de6-4677-840d-ea75b1630d45-kube-api-access-5gbvq\") on node \"crc\" DevicePath \"\"" Nov 25 10:02:09 crc kubenswrapper[4565]: I1125 10:02:09.623077 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/501873a7-8de6-4677-840d-ea75b1630d45-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 10:02:09 crc kubenswrapper[4565]: I1125 10:02:09.623089 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/501873a7-8de6-4677-840d-ea75b1630d45-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 10:02:09 crc kubenswrapper[4565]: I1125 10:02:09.929762 4565 generic.go:334] "Generic (PLEG): container finished" podID="a3071a8a-a30b-4b2b-aea0-5882f4eff1b2" containerID="44d0883560c7569ff51bdc3f478ac620a5610e14f048dd56ba5ddd8a0b24fd52" exitCode=0 Nov 25 10:02:09 crc kubenswrapper[4565]: I1125 10:02:09.929856 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2","Type":"ContainerDied","Data":"44d0883560c7569ff51bdc3f478ac620a5610e14f048dd56ba5ddd8a0b24fd52"} Nov 25 10:02:09 crc kubenswrapper[4565]: I1125 10:02:09.933508 4565 generic.go:334] "Generic (PLEG): container finished" 
podID="501873a7-8de6-4677-840d-ea75b1630d45" containerID="46a629f45f1904088c12ae679bd4853f7cd1f1f2e3d04dd619f5d6ac1fff27e5" exitCode=0 Nov 25 10:02:09 crc kubenswrapper[4565]: I1125 10:02:09.933567 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46drf" event={"ID":"501873a7-8de6-4677-840d-ea75b1630d45","Type":"ContainerDied","Data":"46a629f45f1904088c12ae679bd4853f7cd1f1f2e3d04dd619f5d6ac1fff27e5"} Nov 25 10:02:09 crc kubenswrapper[4565]: I1125 10:02:09.933601 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46drf" event={"ID":"501873a7-8de6-4677-840d-ea75b1630d45","Type":"ContainerDied","Data":"4594a44fcd9b7f86a30e6eaa6cf50a4ac89103a3ac9bd94914313fc961d58299"} Nov 25 10:02:09 crc kubenswrapper[4565]: I1125 10:02:09.933633 4565 scope.go:117] "RemoveContainer" containerID="46a629f45f1904088c12ae679bd4853f7cd1f1f2e3d04dd619f5d6ac1fff27e5" Nov 25 10:02:09 crc kubenswrapper[4565]: I1125 10:02:09.933670 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-46drf" Nov 25 10:02:09 crc kubenswrapper[4565]: I1125 10:02:09.961656 4565 scope.go:117] "RemoveContainer" containerID="eafe1e3b52256e4e263f8bb3c49dd8ddf6e70cc03fd649f35a71cb39beeac6b3" Nov 25 10:02:09 crc kubenswrapper[4565]: I1125 10:02:09.987032 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-46drf"] Nov 25 10:02:10 crc kubenswrapper[4565]: I1125 10:02:10.001910 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-46drf"] Nov 25 10:02:10 crc kubenswrapper[4565]: I1125 10:02:10.006011 4565 scope.go:117] "RemoveContainer" containerID="a4b7c1d0b1d88b9930bc9149fdf71d11ee1f7ed4b7719d6c866d0c6772990706" Nov 25 10:02:10 crc kubenswrapper[4565]: I1125 10:02:10.027979 4565 scope.go:117] "RemoveContainer" containerID="46a629f45f1904088c12ae679bd4853f7cd1f1f2e3d04dd619f5d6ac1fff27e5" Nov 25 10:02:10 crc kubenswrapper[4565]: E1125 10:02:10.028410 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46a629f45f1904088c12ae679bd4853f7cd1f1f2e3d04dd619f5d6ac1fff27e5\": container with ID starting with 46a629f45f1904088c12ae679bd4853f7cd1f1f2e3d04dd619f5d6ac1fff27e5 not found: ID does not exist" containerID="46a629f45f1904088c12ae679bd4853f7cd1f1f2e3d04dd619f5d6ac1fff27e5" Nov 25 10:02:10 crc kubenswrapper[4565]: I1125 10:02:10.028450 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46a629f45f1904088c12ae679bd4853f7cd1f1f2e3d04dd619f5d6ac1fff27e5"} err="failed to get container status \"46a629f45f1904088c12ae679bd4853f7cd1f1f2e3d04dd619f5d6ac1fff27e5\": rpc error: code = NotFound desc = could not find container \"46a629f45f1904088c12ae679bd4853f7cd1f1f2e3d04dd619f5d6ac1fff27e5\": container with ID starting with 46a629f45f1904088c12ae679bd4853f7cd1f1f2e3d04dd619f5d6ac1fff27e5 not found: ID does 
not exist" Nov 25 10:02:10 crc kubenswrapper[4565]: I1125 10:02:10.028481 4565 scope.go:117] "RemoveContainer" containerID="eafe1e3b52256e4e263f8bb3c49dd8ddf6e70cc03fd649f35a71cb39beeac6b3" Nov 25 10:02:10 crc kubenswrapper[4565]: E1125 10:02:10.028786 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eafe1e3b52256e4e263f8bb3c49dd8ddf6e70cc03fd649f35a71cb39beeac6b3\": container with ID starting with eafe1e3b52256e4e263f8bb3c49dd8ddf6e70cc03fd649f35a71cb39beeac6b3 not found: ID does not exist" containerID="eafe1e3b52256e4e263f8bb3c49dd8ddf6e70cc03fd649f35a71cb39beeac6b3" Nov 25 10:02:10 crc kubenswrapper[4565]: I1125 10:02:10.028872 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eafe1e3b52256e4e263f8bb3c49dd8ddf6e70cc03fd649f35a71cb39beeac6b3"} err="failed to get container status \"eafe1e3b52256e4e263f8bb3c49dd8ddf6e70cc03fd649f35a71cb39beeac6b3\": rpc error: code = NotFound desc = could not find container \"eafe1e3b52256e4e263f8bb3c49dd8ddf6e70cc03fd649f35a71cb39beeac6b3\": container with ID starting with eafe1e3b52256e4e263f8bb3c49dd8ddf6e70cc03fd649f35a71cb39beeac6b3 not found: ID does not exist" Nov 25 10:02:10 crc kubenswrapper[4565]: I1125 10:02:10.028898 4565 scope.go:117] "RemoveContainer" containerID="a4b7c1d0b1d88b9930bc9149fdf71d11ee1f7ed4b7719d6c866d0c6772990706" Nov 25 10:02:10 crc kubenswrapper[4565]: E1125 10:02:10.032009 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4b7c1d0b1d88b9930bc9149fdf71d11ee1f7ed4b7719d6c866d0c6772990706\": container with ID starting with a4b7c1d0b1d88b9930bc9149fdf71d11ee1f7ed4b7719d6c866d0c6772990706 not found: ID does not exist" containerID="a4b7c1d0b1d88b9930bc9149fdf71d11ee1f7ed4b7719d6c866d0c6772990706" Nov 25 10:02:10 crc kubenswrapper[4565]: I1125 10:02:10.032057 4565 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4b7c1d0b1d88b9930bc9149fdf71d11ee1f7ed4b7719d6c866d0c6772990706"} err="failed to get container status \"a4b7c1d0b1d88b9930bc9149fdf71d11ee1f7ed4b7719d6c866d0c6772990706\": rpc error: code = NotFound desc = could not find container \"a4b7c1d0b1d88b9930bc9149fdf71d11ee1f7ed4b7719d6c866d0c6772990706\": container with ID starting with a4b7c1d0b1d88b9930bc9149fdf71d11ee1f7ed4b7719d6c866d0c6772990706 not found: ID does not exist" Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.110696 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="501873a7-8de6-4677-840d-ea75b1630d45" path="/var/lib/kubelet/pods/501873a7-8de6-4677-840d-ea75b1630d45/volumes" Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.378073 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.464893 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.465134 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-config-data\") pod \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.465298 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-ssh-key\") pod \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 
10:02:11.465425 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6gbs\" (UniqueName: \"kubernetes.io/projected/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-kube-api-access-j6gbs\") pod \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.465692 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-test-operator-ephemeral-temporary\") pod \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.465763 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-config-data" (OuterVolumeSpecName: "config-data") pod "a3071a8a-a30b-4b2b-aea0-5882f4eff1b2" (UID: "a3071a8a-a30b-4b2b-aea0-5882f4eff1b2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.465874 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-test-operator-ephemeral-workdir\") pod \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.466294 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-ca-certs\") pod \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.466407 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-openstack-config-secret\") pod \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.466481 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-openstack-config\") pod \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\" (UID: \"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2\") " Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.467956 4565 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.466293 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "a3071a8a-a30b-4b2b-aea0-5882f4eff1b2" (UID: "a3071a8a-a30b-4b2b-aea0-5882f4eff1b2"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.470463 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "a3071a8a-a30b-4b2b-aea0-5882f4eff1b2" (UID: "a3071a8a-a30b-4b2b-aea0-5882f4eff1b2"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.472709 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "test-operator-logs") pod "a3071a8a-a30b-4b2b-aea0-5882f4eff1b2" (UID: "a3071a8a-a30b-4b2b-aea0-5882f4eff1b2"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.472861 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-kube-api-access-j6gbs" (OuterVolumeSpecName: "kube-api-access-j6gbs") pod "a3071a8a-a30b-4b2b-aea0-5882f4eff1b2" (UID: "a3071a8a-a30b-4b2b-aea0-5882f4eff1b2"). InnerVolumeSpecName "kube-api-access-j6gbs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.500031 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a3071a8a-a30b-4b2b-aea0-5882f4eff1b2" (UID: "a3071a8a-a30b-4b2b-aea0-5882f4eff1b2"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.500143 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "a3071a8a-a30b-4b2b-aea0-5882f4eff1b2" (UID: "a3071a8a-a30b-4b2b-aea0-5882f4eff1b2"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.500324 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a3071a8a-a30b-4b2b-aea0-5882f4eff1b2" (UID: "a3071a8a-a30b-4b2b-aea0-5882f4eff1b2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.512819 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a3071a8a-a30b-4b2b-aea0-5882f4eff1b2" (UID: "a3071a8a-a30b-4b2b-aea0-5882f4eff1b2"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.572222 4565 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.572263 4565 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.572277 4565 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-ca-certs\") on node \"crc\" DevicePath \"\"" Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.572294 4565 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.572306 4565 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.572357 4565 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.572368 4565 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 10:02:11 crc 
kubenswrapper[4565]: I1125 10:02:11.572381 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6gbs\" (UniqueName: \"kubernetes.io/projected/a3071a8a-a30b-4b2b-aea0-5882f4eff1b2-kube-api-access-j6gbs\") on node \"crc\" DevicePath \"\"" Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.612496 4565 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.674297 4565 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.953092 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a3071a8a-a30b-4b2b-aea0-5882f4eff1b2","Type":"ContainerDied","Data":"3c39a243e4a2d0f71c835739bdb2af34b848e9e77320c5b9790702e7708926ac"} Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.953145 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c39a243e4a2d0f71c835739bdb2af34b848e9e77320c5b9790702e7708926ac" Nov 25 10:02:11 crc kubenswrapper[4565]: I1125 10:02:11.953274 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 25 10:02:13 crc kubenswrapper[4565]: I1125 10:02:13.699553 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 25 10:02:13 crc kubenswrapper[4565]: E1125 10:02:13.700636 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="501873a7-8de6-4677-840d-ea75b1630d45" containerName="registry-server" Nov 25 10:02:13 crc kubenswrapper[4565]: I1125 10:02:13.700655 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="501873a7-8de6-4677-840d-ea75b1630d45" containerName="registry-server" Nov 25 10:02:13 crc kubenswrapper[4565]: E1125 10:02:13.700679 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="501873a7-8de6-4677-840d-ea75b1630d45" containerName="extract-content" Nov 25 10:02:13 crc kubenswrapper[4565]: I1125 10:02:13.700686 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="501873a7-8de6-4677-840d-ea75b1630d45" containerName="extract-content" Nov 25 10:02:13 crc kubenswrapper[4565]: E1125 10:02:13.700732 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3071a8a-a30b-4b2b-aea0-5882f4eff1b2" containerName="tempest-tests-tempest-tests-runner" Nov 25 10:02:13 crc kubenswrapper[4565]: I1125 10:02:13.700739 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3071a8a-a30b-4b2b-aea0-5882f4eff1b2" containerName="tempest-tests-tempest-tests-runner" Nov 25 10:02:13 crc kubenswrapper[4565]: E1125 10:02:13.700760 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="501873a7-8de6-4677-840d-ea75b1630d45" containerName="extract-utilities" Nov 25 10:02:13 crc kubenswrapper[4565]: I1125 10:02:13.700766 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="501873a7-8de6-4677-840d-ea75b1630d45" containerName="extract-utilities" Nov 25 10:02:13 crc kubenswrapper[4565]: I1125 10:02:13.701029 4565 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="501873a7-8de6-4677-840d-ea75b1630d45" containerName="registry-server" Nov 25 10:02:13 crc kubenswrapper[4565]: I1125 10:02:13.701055 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3071a8a-a30b-4b2b-aea0-5882f4eff1b2" containerName="tempest-tests-tempest-tests-runner" Nov 25 10:02:13 crc kubenswrapper[4565]: I1125 10:02:13.702141 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 10:02:13 crc kubenswrapper[4565]: I1125 10:02:13.704619 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-2kmjl" Nov 25 10:02:13 crc kubenswrapper[4565]: I1125 10:02:13.726115 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 25 10:02:13 crc kubenswrapper[4565]: I1125 10:02:13.810797 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1a5fdee8-1424-4227-a297-3c68d5463280\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 10:02:13 crc kubenswrapper[4565]: I1125 10:02:13.810970 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g72lj\" (UniqueName: \"kubernetes.io/projected/1a5fdee8-1424-4227-a297-3c68d5463280-kube-api-access-g72lj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1a5fdee8-1424-4227-a297-3c68d5463280\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 10:02:13 crc kubenswrapper[4565]: I1125 10:02:13.914248 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1a5fdee8-1424-4227-a297-3c68d5463280\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 10:02:13 crc kubenswrapper[4565]: I1125 10:02:13.914665 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g72lj\" (UniqueName: \"kubernetes.io/projected/1a5fdee8-1424-4227-a297-3c68d5463280-kube-api-access-g72lj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1a5fdee8-1424-4227-a297-3c68d5463280\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 10:02:13 crc kubenswrapper[4565]: I1125 10:02:13.914753 4565 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1a5fdee8-1424-4227-a297-3c68d5463280\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 10:02:13 crc kubenswrapper[4565]: I1125 10:02:13.932783 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g72lj\" (UniqueName: \"kubernetes.io/projected/1a5fdee8-1424-4227-a297-3c68d5463280-kube-api-access-g72lj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1a5fdee8-1424-4227-a297-3c68d5463280\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 10:02:13 crc kubenswrapper[4565]: I1125 10:02:13.946461 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1a5fdee8-1424-4227-a297-3c68d5463280\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 10:02:14 
crc kubenswrapper[4565]: I1125 10:02:14.031732 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 10:02:14 crc kubenswrapper[4565]: I1125 10:02:14.454643 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 25 10:02:14 crc kubenswrapper[4565]: I1125 10:02:14.981227 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"1a5fdee8-1424-4227-a297-3c68d5463280","Type":"ContainerStarted","Data":"e68ff5b7f38d21bff9560c86ca001c939e3268ca22ff1f380ec8de3e5cb153a2"} Nov 25 10:02:15 crc kubenswrapper[4565]: I1125 10:02:15.991032 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"1a5fdee8-1424-4227-a297-3c68d5463280","Type":"ContainerStarted","Data":"d23400703420872ea17f0b7190c19719227b2e5ef3f19aafb52dccb42853100b"} Nov 25 10:02:35 crc kubenswrapper[4565]: I1125 10:02:35.281335 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=21.431708502 podStartE2EDuration="22.281316407s" podCreationTimestamp="2025-11-25 10:02:13 +0000 UTC" firstStartedPulling="2025-11-25 10:02:14.4686639 +0000 UTC m=+3467.671159038" lastFinishedPulling="2025-11-25 10:02:15.318271805 +0000 UTC m=+3468.520766943" observedRunningTime="2025-11-25 10:02:16.004159168 +0000 UTC m=+3469.206654307" watchObservedRunningTime="2025-11-25 10:02:35.281316407 +0000 UTC m=+3488.483811546" Nov 25 10:02:35 crc kubenswrapper[4565]: I1125 10:02:35.283481 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qx9jc/must-gather-qvhhg"] Nov 25 10:02:35 crc kubenswrapper[4565]: I1125 10:02:35.284984 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qx9jc/must-gather-qvhhg" Nov 25 10:02:35 crc kubenswrapper[4565]: I1125 10:02:35.288204 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-qx9jc"/"default-dockercfg-k972k" Nov 25 10:02:35 crc kubenswrapper[4565]: I1125 10:02:35.288567 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qx9jc"/"openshift-service-ca.crt" Nov 25 10:02:35 crc kubenswrapper[4565]: I1125 10:02:35.288615 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qx9jc"/"kube-root-ca.crt" Nov 25 10:02:35 crc kubenswrapper[4565]: I1125 10:02:35.303067 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qx9jc/must-gather-qvhhg"] Nov 25 10:02:35 crc kubenswrapper[4565]: I1125 10:02:35.393621 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ab256124-fb53-42eb-b130-47c81598e7b9-must-gather-output\") pod \"must-gather-qvhhg\" (UID: \"ab256124-fb53-42eb-b130-47c81598e7b9\") " pod="openshift-must-gather-qx9jc/must-gather-qvhhg" Nov 25 10:02:35 crc kubenswrapper[4565]: I1125 10:02:35.393915 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k56q4\" (UniqueName: \"kubernetes.io/projected/ab256124-fb53-42eb-b130-47c81598e7b9-kube-api-access-k56q4\") pod \"must-gather-qvhhg\" (UID: \"ab256124-fb53-42eb-b130-47c81598e7b9\") " pod="openshift-must-gather-qx9jc/must-gather-qvhhg" Nov 25 10:02:35 crc kubenswrapper[4565]: I1125 10:02:35.497383 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ab256124-fb53-42eb-b130-47c81598e7b9-must-gather-output\") pod \"must-gather-qvhhg\" (UID: \"ab256124-fb53-42eb-b130-47c81598e7b9\") " 
pod="openshift-must-gather-qx9jc/must-gather-qvhhg" Nov 25 10:02:35 crc kubenswrapper[4565]: I1125 10:02:35.497468 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k56q4\" (UniqueName: \"kubernetes.io/projected/ab256124-fb53-42eb-b130-47c81598e7b9-kube-api-access-k56q4\") pod \"must-gather-qvhhg\" (UID: \"ab256124-fb53-42eb-b130-47c81598e7b9\") " pod="openshift-must-gather-qx9jc/must-gather-qvhhg" Nov 25 10:02:35 crc kubenswrapper[4565]: I1125 10:02:35.498046 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ab256124-fb53-42eb-b130-47c81598e7b9-must-gather-output\") pod \"must-gather-qvhhg\" (UID: \"ab256124-fb53-42eb-b130-47c81598e7b9\") " pod="openshift-must-gather-qx9jc/must-gather-qvhhg" Nov 25 10:02:35 crc kubenswrapper[4565]: I1125 10:02:35.514877 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k56q4\" (UniqueName: \"kubernetes.io/projected/ab256124-fb53-42eb-b130-47c81598e7b9-kube-api-access-k56q4\") pod \"must-gather-qvhhg\" (UID: \"ab256124-fb53-42eb-b130-47c81598e7b9\") " pod="openshift-must-gather-qx9jc/must-gather-qvhhg" Nov 25 10:02:35 crc kubenswrapper[4565]: I1125 10:02:35.602352 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qx9jc/must-gather-qvhhg" Nov 25 10:02:36 crc kubenswrapper[4565]: I1125 10:02:36.061102 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qx9jc/must-gather-qvhhg"] Nov 25 10:02:36 crc kubenswrapper[4565]: I1125 10:02:36.180968 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qx9jc/must-gather-qvhhg" event={"ID":"ab256124-fb53-42eb-b130-47c81598e7b9","Type":"ContainerStarted","Data":"d885e90a25f62339a4c9aebca46b8dc855e0153dce343abf3c80c2199883c6b7"} Nov 25 10:02:41 crc kubenswrapper[4565]: I1125 10:02:41.239101 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qx9jc/must-gather-qvhhg" event={"ID":"ab256124-fb53-42eb-b130-47c81598e7b9","Type":"ContainerStarted","Data":"e99880874bfa4d6f26d6dd61dd3921406d1c0a4789160765ad4a52e42c80fb05"} Nov 25 10:02:41 crc kubenswrapper[4565]: I1125 10:02:41.239873 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qx9jc/must-gather-qvhhg" event={"ID":"ab256124-fb53-42eb-b130-47c81598e7b9","Type":"ContainerStarted","Data":"e3463d9f6a27bdfdec3dad6d18e17d7453b6a4d195fc5c80b5437556f2b55e71"} Nov 25 10:02:41 crc kubenswrapper[4565]: I1125 10:02:41.266653 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qx9jc/must-gather-qvhhg" podStartSLOduration=2.262170716 podStartE2EDuration="6.266639958s" podCreationTimestamp="2025-11-25 10:02:35 +0000 UTC" firstStartedPulling="2025-11-25 10:02:36.057751177 +0000 UTC m=+3489.260246315" lastFinishedPulling="2025-11-25 10:02:40.062220418 +0000 UTC m=+3493.264715557" observedRunningTime="2025-11-25 10:02:41.256200579 +0000 UTC m=+3494.458695717" watchObservedRunningTime="2025-11-25 10:02:41.266639958 +0000 UTC m=+3494.469135095" Nov 25 10:02:45 crc kubenswrapper[4565]: I1125 10:02:45.161085 4565 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-qx9jc/crc-debug-qzdb8"] Nov 25 10:02:45 crc kubenswrapper[4565]: I1125 10:02:45.163492 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qx9jc/crc-debug-qzdb8" Nov 25 10:02:45 crc kubenswrapper[4565]: I1125 10:02:45.352156 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2qds\" (UniqueName: \"kubernetes.io/projected/63b1b7b6-0432-4292-aafa-ed7ee32210ed-kube-api-access-r2qds\") pod \"crc-debug-qzdb8\" (UID: \"63b1b7b6-0432-4292-aafa-ed7ee32210ed\") " pod="openshift-must-gather-qx9jc/crc-debug-qzdb8" Nov 25 10:02:45 crc kubenswrapper[4565]: I1125 10:02:45.352779 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63b1b7b6-0432-4292-aafa-ed7ee32210ed-host\") pod \"crc-debug-qzdb8\" (UID: \"63b1b7b6-0432-4292-aafa-ed7ee32210ed\") " pod="openshift-must-gather-qx9jc/crc-debug-qzdb8" Nov 25 10:02:45 crc kubenswrapper[4565]: I1125 10:02:45.455683 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2qds\" (UniqueName: \"kubernetes.io/projected/63b1b7b6-0432-4292-aafa-ed7ee32210ed-kube-api-access-r2qds\") pod \"crc-debug-qzdb8\" (UID: \"63b1b7b6-0432-4292-aafa-ed7ee32210ed\") " pod="openshift-must-gather-qx9jc/crc-debug-qzdb8" Nov 25 10:02:45 crc kubenswrapper[4565]: I1125 10:02:45.455764 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63b1b7b6-0432-4292-aafa-ed7ee32210ed-host\") pod \"crc-debug-qzdb8\" (UID: \"63b1b7b6-0432-4292-aafa-ed7ee32210ed\") " pod="openshift-must-gather-qx9jc/crc-debug-qzdb8" Nov 25 10:02:45 crc kubenswrapper[4565]: I1125 10:02:45.456180 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/63b1b7b6-0432-4292-aafa-ed7ee32210ed-host\") pod \"crc-debug-qzdb8\" (UID: \"63b1b7b6-0432-4292-aafa-ed7ee32210ed\") " pod="openshift-must-gather-qx9jc/crc-debug-qzdb8" Nov 25 10:02:45 crc kubenswrapper[4565]: I1125 10:02:45.475723 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2qds\" (UniqueName: \"kubernetes.io/projected/63b1b7b6-0432-4292-aafa-ed7ee32210ed-kube-api-access-r2qds\") pod \"crc-debug-qzdb8\" (UID: \"63b1b7b6-0432-4292-aafa-ed7ee32210ed\") " pod="openshift-must-gather-qx9jc/crc-debug-qzdb8" Nov 25 10:02:45 crc kubenswrapper[4565]: I1125 10:02:45.480940 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qx9jc/crc-debug-qzdb8" Nov 25 10:02:46 crc kubenswrapper[4565]: I1125 10:02:46.295250 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qx9jc/crc-debug-qzdb8" event={"ID":"63b1b7b6-0432-4292-aafa-ed7ee32210ed","Type":"ContainerStarted","Data":"9b3e5117eba0b67ece7b4f91bf579c4233d461347eaa97d7478b9d4b2876579c"} Nov 25 10:02:53 crc kubenswrapper[4565]: I1125 10:02:53.391805 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l65qw"] Nov 25 10:02:53 crc kubenswrapper[4565]: I1125 10:02:53.394285 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l65qw" Nov 25 10:02:53 crc kubenswrapper[4565]: I1125 10:02:53.400729 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l65qw"] Nov 25 10:02:53 crc kubenswrapper[4565]: I1125 10:02:53.558600 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcmmw\" (UniqueName: \"kubernetes.io/projected/b957ecb9-83f8-47f7-af42-85961fdb9cf4-kube-api-access-qcmmw\") pod \"certified-operators-l65qw\" (UID: \"b957ecb9-83f8-47f7-af42-85961fdb9cf4\") " pod="openshift-marketplace/certified-operators-l65qw" Nov 25 10:02:53 crc kubenswrapper[4565]: I1125 10:02:53.558693 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b957ecb9-83f8-47f7-af42-85961fdb9cf4-utilities\") pod \"certified-operators-l65qw\" (UID: \"b957ecb9-83f8-47f7-af42-85961fdb9cf4\") " pod="openshift-marketplace/certified-operators-l65qw" Nov 25 10:02:53 crc kubenswrapper[4565]: I1125 10:02:53.558783 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b957ecb9-83f8-47f7-af42-85961fdb9cf4-catalog-content\") pod \"certified-operators-l65qw\" (UID: \"b957ecb9-83f8-47f7-af42-85961fdb9cf4\") " pod="openshift-marketplace/certified-operators-l65qw" Nov 25 10:02:53 crc kubenswrapper[4565]: I1125 10:02:53.660724 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b957ecb9-83f8-47f7-af42-85961fdb9cf4-utilities\") pod \"certified-operators-l65qw\" (UID: \"b957ecb9-83f8-47f7-af42-85961fdb9cf4\") " pod="openshift-marketplace/certified-operators-l65qw" Nov 25 10:02:53 crc kubenswrapper[4565]: I1125 10:02:53.660850 4565 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b957ecb9-83f8-47f7-af42-85961fdb9cf4-catalog-content\") pod \"certified-operators-l65qw\" (UID: \"b957ecb9-83f8-47f7-af42-85961fdb9cf4\") " pod="openshift-marketplace/certified-operators-l65qw" Nov 25 10:02:53 crc kubenswrapper[4565]: I1125 10:02:53.660902 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcmmw\" (UniqueName: \"kubernetes.io/projected/b957ecb9-83f8-47f7-af42-85961fdb9cf4-kube-api-access-qcmmw\") pod \"certified-operators-l65qw\" (UID: \"b957ecb9-83f8-47f7-af42-85961fdb9cf4\") " pod="openshift-marketplace/certified-operators-l65qw" Nov 25 10:02:53 crc kubenswrapper[4565]: I1125 10:02:53.661227 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b957ecb9-83f8-47f7-af42-85961fdb9cf4-utilities\") pod \"certified-operators-l65qw\" (UID: \"b957ecb9-83f8-47f7-af42-85961fdb9cf4\") " pod="openshift-marketplace/certified-operators-l65qw" Nov 25 10:02:53 crc kubenswrapper[4565]: I1125 10:02:53.661405 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b957ecb9-83f8-47f7-af42-85961fdb9cf4-catalog-content\") pod \"certified-operators-l65qw\" (UID: \"b957ecb9-83f8-47f7-af42-85961fdb9cf4\") " pod="openshift-marketplace/certified-operators-l65qw" Nov 25 10:02:53 crc kubenswrapper[4565]: I1125 10:02:53.677518 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcmmw\" (UniqueName: \"kubernetes.io/projected/b957ecb9-83f8-47f7-af42-85961fdb9cf4-kube-api-access-qcmmw\") pod \"certified-operators-l65qw\" (UID: \"b957ecb9-83f8-47f7-af42-85961fdb9cf4\") " pod="openshift-marketplace/certified-operators-l65qw" Nov 25 10:02:53 crc kubenswrapper[4565]: I1125 10:02:53.726854 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l65qw" Nov 25 10:02:55 crc kubenswrapper[4565]: I1125 10:02:55.100144 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 10:02:55 crc kubenswrapper[4565]: I1125 10:02:55.100505 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 10:02:56 crc kubenswrapper[4565]: I1125 10:02:56.781817 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l65qw"] Nov 25 10:02:57 crc kubenswrapper[4565]: I1125 10:02:57.429909 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qx9jc/crc-debug-qzdb8" event={"ID":"63b1b7b6-0432-4292-aafa-ed7ee32210ed","Type":"ContainerStarted","Data":"b48759ab1a282ec541542b69dd8233a65a0610437c3216c5df05b40ae3a8c076"} Nov 25 10:02:57 crc kubenswrapper[4565]: I1125 10:02:57.431783 4565 generic.go:334] "Generic (PLEG): container finished" podID="b957ecb9-83f8-47f7-af42-85961fdb9cf4" containerID="c02a1d967076d2f02820e985bf49bdd6f20ff20693db53999f2d7be8a7f1cc49" exitCode=0 Nov 25 10:02:57 crc kubenswrapper[4565]: I1125 10:02:57.431809 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l65qw" event={"ID":"b957ecb9-83f8-47f7-af42-85961fdb9cf4","Type":"ContainerDied","Data":"c02a1d967076d2f02820e985bf49bdd6f20ff20693db53999f2d7be8a7f1cc49"} Nov 25 10:02:57 crc kubenswrapper[4565]: I1125 10:02:57.431841 4565 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-l65qw" event={"ID":"b957ecb9-83f8-47f7-af42-85961fdb9cf4","Type":"ContainerStarted","Data":"8039d87f939ac292e91587c5364b05dcbf98e982d7bcea8ea2be7fe9a7dbaeb1"} Nov 25 10:02:57 crc kubenswrapper[4565]: I1125 10:02:57.450757 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qx9jc/crc-debug-qzdb8" podStartSLOduration=1.594138125 podStartE2EDuration="12.450736062s" podCreationTimestamp="2025-11-25 10:02:45 +0000 UTC" firstStartedPulling="2025-11-25 10:02:45.513087454 +0000 UTC m=+3498.715582592" lastFinishedPulling="2025-11-25 10:02:56.369685391 +0000 UTC m=+3509.572180529" observedRunningTime="2025-11-25 10:02:57.441776221 +0000 UTC m=+3510.644271360" watchObservedRunningTime="2025-11-25 10:02:57.450736062 +0000 UTC m=+3510.653231199" Nov 25 10:02:58 crc kubenswrapper[4565]: I1125 10:02:58.443828 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l65qw" event={"ID":"b957ecb9-83f8-47f7-af42-85961fdb9cf4","Type":"ContainerStarted","Data":"33bbf66cb9aa9bfe311600f26cce6097761cc4f35cac42656f468e3fcf35ef76"} Nov 25 10:02:59 crc kubenswrapper[4565]: I1125 10:02:59.454581 4565 generic.go:334] "Generic (PLEG): container finished" podID="b957ecb9-83f8-47f7-af42-85961fdb9cf4" containerID="33bbf66cb9aa9bfe311600f26cce6097761cc4f35cac42656f468e3fcf35ef76" exitCode=0 Nov 25 10:02:59 crc kubenswrapper[4565]: I1125 10:02:59.454752 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l65qw" event={"ID":"b957ecb9-83f8-47f7-af42-85961fdb9cf4","Type":"ContainerDied","Data":"33bbf66cb9aa9bfe311600f26cce6097761cc4f35cac42656f468e3fcf35ef76"} Nov 25 10:03:00 crc kubenswrapper[4565]: I1125 10:03:00.466054 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l65qw" 
event={"ID":"b957ecb9-83f8-47f7-af42-85961fdb9cf4","Type":"ContainerStarted","Data":"a0e04dd6b13ee62f30097336a0e47cb788ec91b4cccae72b13c8a65c4c2b875c"} Nov 25 10:03:00 crc kubenswrapper[4565]: I1125 10:03:00.485596 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l65qw" podStartSLOduration=4.970032529 podStartE2EDuration="7.485577844s" podCreationTimestamp="2025-11-25 10:02:53 +0000 UTC" firstStartedPulling="2025-11-25 10:02:57.434039638 +0000 UTC m=+3510.636534776" lastFinishedPulling="2025-11-25 10:02:59.949584953 +0000 UTC m=+3513.152080091" observedRunningTime="2025-11-25 10:03:00.481659588 +0000 UTC m=+3513.684154726" watchObservedRunningTime="2025-11-25 10:03:00.485577844 +0000 UTC m=+3513.688072982" Nov 25 10:03:03 crc kubenswrapper[4565]: I1125 10:03:03.727542 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l65qw" Nov 25 10:03:03 crc kubenswrapper[4565]: I1125 10:03:03.728547 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l65qw" Nov 25 10:03:03 crc kubenswrapper[4565]: I1125 10:03:03.808027 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l65qw" Nov 25 10:03:13 crc kubenswrapper[4565]: I1125 10:03:13.782461 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l65qw" Nov 25 10:03:13 crc kubenswrapper[4565]: I1125 10:03:13.857090 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l65qw"] Nov 25 10:03:14 crc kubenswrapper[4565]: I1125 10:03:14.600209 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l65qw" podUID="b957ecb9-83f8-47f7-af42-85961fdb9cf4" containerName="registry-server" 
containerID="cri-o://a0e04dd6b13ee62f30097336a0e47cb788ec91b4cccae72b13c8a65c4c2b875c" gracePeriod=2 Nov 25 10:03:15 crc kubenswrapper[4565]: I1125 10:03:15.465009 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l65qw" Nov 25 10:03:15 crc kubenswrapper[4565]: I1125 10:03:15.581774 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b957ecb9-83f8-47f7-af42-85961fdb9cf4-utilities\") pod \"b957ecb9-83f8-47f7-af42-85961fdb9cf4\" (UID: \"b957ecb9-83f8-47f7-af42-85961fdb9cf4\") " Nov 25 10:03:15 crc kubenswrapper[4565]: I1125 10:03:15.582175 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcmmw\" (UniqueName: \"kubernetes.io/projected/b957ecb9-83f8-47f7-af42-85961fdb9cf4-kube-api-access-qcmmw\") pod \"b957ecb9-83f8-47f7-af42-85961fdb9cf4\" (UID: \"b957ecb9-83f8-47f7-af42-85961fdb9cf4\") " Nov 25 10:03:15 crc kubenswrapper[4565]: I1125 10:03:15.582349 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b957ecb9-83f8-47f7-af42-85961fdb9cf4-catalog-content\") pod \"b957ecb9-83f8-47f7-af42-85961fdb9cf4\" (UID: \"b957ecb9-83f8-47f7-af42-85961fdb9cf4\") " Nov 25 10:03:15 crc kubenswrapper[4565]: I1125 10:03:15.586598 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b957ecb9-83f8-47f7-af42-85961fdb9cf4-utilities" (OuterVolumeSpecName: "utilities") pod "b957ecb9-83f8-47f7-af42-85961fdb9cf4" (UID: "b957ecb9-83f8-47f7-af42-85961fdb9cf4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 10:03:15 crc kubenswrapper[4565]: I1125 10:03:15.605175 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b957ecb9-83f8-47f7-af42-85961fdb9cf4-kube-api-access-qcmmw" (OuterVolumeSpecName: "kube-api-access-qcmmw") pod "b957ecb9-83f8-47f7-af42-85961fdb9cf4" (UID: "b957ecb9-83f8-47f7-af42-85961fdb9cf4"). InnerVolumeSpecName "kube-api-access-qcmmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 10:03:15 crc kubenswrapper[4565]: I1125 10:03:15.641051 4565 generic.go:334] "Generic (PLEG): container finished" podID="b957ecb9-83f8-47f7-af42-85961fdb9cf4" containerID="a0e04dd6b13ee62f30097336a0e47cb788ec91b4cccae72b13c8a65c4c2b875c" exitCode=0 Nov 25 10:03:15 crc kubenswrapper[4565]: I1125 10:03:15.641098 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l65qw" event={"ID":"b957ecb9-83f8-47f7-af42-85961fdb9cf4","Type":"ContainerDied","Data":"a0e04dd6b13ee62f30097336a0e47cb788ec91b4cccae72b13c8a65c4c2b875c"} Nov 25 10:03:15 crc kubenswrapper[4565]: I1125 10:03:15.641129 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l65qw" event={"ID":"b957ecb9-83f8-47f7-af42-85961fdb9cf4","Type":"ContainerDied","Data":"8039d87f939ac292e91587c5364b05dcbf98e982d7bcea8ea2be7fe9a7dbaeb1"} Nov 25 10:03:15 crc kubenswrapper[4565]: I1125 10:03:15.641147 4565 scope.go:117] "RemoveContainer" containerID="a0e04dd6b13ee62f30097336a0e47cb788ec91b4cccae72b13c8a65c4c2b875c" Nov 25 10:03:15 crc kubenswrapper[4565]: I1125 10:03:15.641296 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l65qw" Nov 25 10:03:15 crc kubenswrapper[4565]: I1125 10:03:15.669357 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b957ecb9-83f8-47f7-af42-85961fdb9cf4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b957ecb9-83f8-47f7-af42-85961fdb9cf4" (UID: "b957ecb9-83f8-47f7-af42-85961fdb9cf4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 10:03:15 crc kubenswrapper[4565]: I1125 10:03:15.687224 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcmmw\" (UniqueName: \"kubernetes.io/projected/b957ecb9-83f8-47f7-af42-85961fdb9cf4-kube-api-access-qcmmw\") on node \"crc\" DevicePath \"\"" Nov 25 10:03:15 crc kubenswrapper[4565]: I1125 10:03:15.687268 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b957ecb9-83f8-47f7-af42-85961fdb9cf4-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 10:03:15 crc kubenswrapper[4565]: I1125 10:03:15.687278 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b957ecb9-83f8-47f7-af42-85961fdb9cf4-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 10:03:15 crc kubenswrapper[4565]: I1125 10:03:15.696501 4565 scope.go:117] "RemoveContainer" containerID="33bbf66cb9aa9bfe311600f26cce6097761cc4f35cac42656f468e3fcf35ef76" Nov 25 10:03:15 crc kubenswrapper[4565]: I1125 10:03:15.815908 4565 scope.go:117] "RemoveContainer" containerID="c02a1d967076d2f02820e985bf49bdd6f20ff20693db53999f2d7be8a7f1cc49" Nov 25 10:03:15 crc kubenswrapper[4565]: I1125 10:03:15.934721 4565 scope.go:117] "RemoveContainer" containerID="a0e04dd6b13ee62f30097336a0e47cb788ec91b4cccae72b13c8a65c4c2b875c" Nov 25 10:03:15 crc kubenswrapper[4565]: E1125 10:03:15.938253 4565 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"a0e04dd6b13ee62f30097336a0e47cb788ec91b4cccae72b13c8a65c4c2b875c\": container with ID starting with a0e04dd6b13ee62f30097336a0e47cb788ec91b4cccae72b13c8a65c4c2b875c not found: ID does not exist" containerID="a0e04dd6b13ee62f30097336a0e47cb788ec91b4cccae72b13c8a65c4c2b875c" Nov 25 10:03:15 crc kubenswrapper[4565]: I1125 10:03:15.938440 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0e04dd6b13ee62f30097336a0e47cb788ec91b4cccae72b13c8a65c4c2b875c"} err="failed to get container status \"a0e04dd6b13ee62f30097336a0e47cb788ec91b4cccae72b13c8a65c4c2b875c\": rpc error: code = NotFound desc = could not find container \"a0e04dd6b13ee62f30097336a0e47cb788ec91b4cccae72b13c8a65c4c2b875c\": container with ID starting with a0e04dd6b13ee62f30097336a0e47cb788ec91b4cccae72b13c8a65c4c2b875c not found: ID does not exist" Nov 25 10:03:15 crc kubenswrapper[4565]: I1125 10:03:15.938499 4565 scope.go:117] "RemoveContainer" containerID="33bbf66cb9aa9bfe311600f26cce6097761cc4f35cac42656f468e3fcf35ef76" Nov 25 10:03:15 crc kubenswrapper[4565]: E1125 10:03:15.948091 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33bbf66cb9aa9bfe311600f26cce6097761cc4f35cac42656f468e3fcf35ef76\": container with ID starting with 33bbf66cb9aa9bfe311600f26cce6097761cc4f35cac42656f468e3fcf35ef76 not found: ID does not exist" containerID="33bbf66cb9aa9bfe311600f26cce6097761cc4f35cac42656f468e3fcf35ef76" Nov 25 10:03:15 crc kubenswrapper[4565]: I1125 10:03:15.948132 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33bbf66cb9aa9bfe311600f26cce6097761cc4f35cac42656f468e3fcf35ef76"} err="failed to get container status \"33bbf66cb9aa9bfe311600f26cce6097761cc4f35cac42656f468e3fcf35ef76\": rpc error: code = NotFound desc = could not find container 
\"33bbf66cb9aa9bfe311600f26cce6097761cc4f35cac42656f468e3fcf35ef76\": container with ID starting with 33bbf66cb9aa9bfe311600f26cce6097761cc4f35cac42656f468e3fcf35ef76 not found: ID does not exist" Nov 25 10:03:15 crc kubenswrapper[4565]: I1125 10:03:15.948159 4565 scope.go:117] "RemoveContainer" containerID="c02a1d967076d2f02820e985bf49bdd6f20ff20693db53999f2d7be8a7f1cc49" Nov 25 10:03:15 crc kubenswrapper[4565]: E1125 10:03:15.950111 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c02a1d967076d2f02820e985bf49bdd6f20ff20693db53999f2d7be8a7f1cc49\": container with ID starting with c02a1d967076d2f02820e985bf49bdd6f20ff20693db53999f2d7be8a7f1cc49 not found: ID does not exist" containerID="c02a1d967076d2f02820e985bf49bdd6f20ff20693db53999f2d7be8a7f1cc49" Nov 25 10:03:15 crc kubenswrapper[4565]: I1125 10:03:15.950169 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c02a1d967076d2f02820e985bf49bdd6f20ff20693db53999f2d7be8a7f1cc49"} err="failed to get container status \"c02a1d967076d2f02820e985bf49bdd6f20ff20693db53999f2d7be8a7f1cc49\": rpc error: code = NotFound desc = could not find container \"c02a1d967076d2f02820e985bf49bdd6f20ff20693db53999f2d7be8a7f1cc49\": container with ID starting with c02a1d967076d2f02820e985bf49bdd6f20ff20693db53999f2d7be8a7f1cc49 not found: ID does not exist" Nov 25 10:03:16 crc kubenswrapper[4565]: I1125 10:03:16.005522 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l65qw"] Nov 25 10:03:16 crc kubenswrapper[4565]: I1125 10:03:16.029590 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l65qw"] Nov 25 10:03:17 crc kubenswrapper[4565]: I1125 10:03:17.108094 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b957ecb9-83f8-47f7-af42-85961fdb9cf4" 
path="/var/lib/kubelet/pods/b957ecb9-83f8-47f7-af42-85961fdb9cf4/volumes" Nov 25 10:03:25 crc kubenswrapper[4565]: I1125 10:03:25.099658 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 10:03:25 crc kubenswrapper[4565]: I1125 10:03:25.101110 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 10:03:38 crc kubenswrapper[4565]: I1125 10:03:38.925506 4565 generic.go:334] "Generic (PLEG): container finished" podID="63b1b7b6-0432-4292-aafa-ed7ee32210ed" containerID="b48759ab1a282ec541542b69dd8233a65a0610437c3216c5df05b40ae3a8c076" exitCode=0 Nov 25 10:03:38 crc kubenswrapper[4565]: I1125 10:03:38.925590 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qx9jc/crc-debug-qzdb8" event={"ID":"63b1b7b6-0432-4292-aafa-ed7ee32210ed","Type":"ContainerDied","Data":"b48759ab1a282ec541542b69dd8233a65a0610437c3216c5df05b40ae3a8c076"} Nov 25 10:03:40 crc kubenswrapper[4565]: I1125 10:03:40.011776 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qx9jc/crc-debug-qzdb8" Nov 25 10:03:40 crc kubenswrapper[4565]: I1125 10:03:40.050980 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qx9jc/crc-debug-qzdb8"] Nov 25 10:03:40 crc kubenswrapper[4565]: I1125 10:03:40.064067 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qx9jc/crc-debug-qzdb8"] Nov 25 10:03:40 crc kubenswrapper[4565]: I1125 10:03:40.083024 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2qds\" (UniqueName: \"kubernetes.io/projected/63b1b7b6-0432-4292-aafa-ed7ee32210ed-kube-api-access-r2qds\") pod \"63b1b7b6-0432-4292-aafa-ed7ee32210ed\" (UID: \"63b1b7b6-0432-4292-aafa-ed7ee32210ed\") " Nov 25 10:03:40 crc kubenswrapper[4565]: I1125 10:03:40.083327 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63b1b7b6-0432-4292-aafa-ed7ee32210ed-host\") pod \"63b1b7b6-0432-4292-aafa-ed7ee32210ed\" (UID: \"63b1b7b6-0432-4292-aafa-ed7ee32210ed\") " Nov 25 10:03:40 crc kubenswrapper[4565]: I1125 10:03:40.083464 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63b1b7b6-0432-4292-aafa-ed7ee32210ed-host" (OuterVolumeSpecName: "host") pod "63b1b7b6-0432-4292-aafa-ed7ee32210ed" (UID: "63b1b7b6-0432-4292-aafa-ed7ee32210ed"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 10:03:40 crc kubenswrapper[4565]: I1125 10:03:40.084410 4565 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63b1b7b6-0432-4292-aafa-ed7ee32210ed-host\") on node \"crc\" DevicePath \"\"" Nov 25 10:03:40 crc kubenswrapper[4565]: I1125 10:03:40.097071 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63b1b7b6-0432-4292-aafa-ed7ee32210ed-kube-api-access-r2qds" (OuterVolumeSpecName: "kube-api-access-r2qds") pod "63b1b7b6-0432-4292-aafa-ed7ee32210ed" (UID: "63b1b7b6-0432-4292-aafa-ed7ee32210ed"). InnerVolumeSpecName "kube-api-access-r2qds". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 10:03:40 crc kubenswrapper[4565]: I1125 10:03:40.185379 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2qds\" (UniqueName: \"kubernetes.io/projected/63b1b7b6-0432-4292-aafa-ed7ee32210ed-kube-api-access-r2qds\") on node \"crc\" DevicePath \"\"" Nov 25 10:03:40 crc kubenswrapper[4565]: I1125 10:03:40.944262 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b3e5117eba0b67ece7b4f91bf579c4233d461347eaa97d7478b9d4b2876579c" Nov 25 10:03:40 crc kubenswrapper[4565]: I1125 10:03:40.944322 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qx9jc/crc-debug-qzdb8" Nov 25 10:03:41 crc kubenswrapper[4565]: I1125 10:03:41.106235 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63b1b7b6-0432-4292-aafa-ed7ee32210ed" path="/var/lib/kubelet/pods/63b1b7b6-0432-4292-aafa-ed7ee32210ed/volumes" Nov 25 10:03:41 crc kubenswrapper[4565]: I1125 10:03:41.450496 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qx9jc/crc-debug-zbfct"] Nov 25 10:03:41 crc kubenswrapper[4565]: E1125 10:03:41.450867 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b957ecb9-83f8-47f7-af42-85961fdb9cf4" containerName="registry-server" Nov 25 10:03:41 crc kubenswrapper[4565]: I1125 10:03:41.450886 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b957ecb9-83f8-47f7-af42-85961fdb9cf4" containerName="registry-server" Nov 25 10:03:41 crc kubenswrapper[4565]: E1125 10:03:41.450911 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b957ecb9-83f8-47f7-af42-85961fdb9cf4" containerName="extract-content" Nov 25 10:03:41 crc kubenswrapper[4565]: I1125 10:03:41.450917 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b957ecb9-83f8-47f7-af42-85961fdb9cf4" containerName="extract-content" Nov 25 10:03:41 crc kubenswrapper[4565]: E1125 10:03:41.450941 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b957ecb9-83f8-47f7-af42-85961fdb9cf4" containerName="extract-utilities" Nov 25 10:03:41 crc kubenswrapper[4565]: I1125 10:03:41.450947 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b957ecb9-83f8-47f7-af42-85961fdb9cf4" containerName="extract-utilities" Nov 25 10:03:41 crc kubenswrapper[4565]: E1125 10:03:41.450974 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63b1b7b6-0432-4292-aafa-ed7ee32210ed" containerName="container-00" Nov 25 10:03:41 crc kubenswrapper[4565]: I1125 10:03:41.450979 4565 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="63b1b7b6-0432-4292-aafa-ed7ee32210ed" containerName="container-00" Nov 25 10:03:41 crc kubenswrapper[4565]: I1125 10:03:41.451149 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="63b1b7b6-0432-4292-aafa-ed7ee32210ed" containerName="container-00" Nov 25 10:03:41 crc kubenswrapper[4565]: I1125 10:03:41.451170 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="b957ecb9-83f8-47f7-af42-85961fdb9cf4" containerName="registry-server" Nov 25 10:03:41 crc kubenswrapper[4565]: I1125 10:03:41.451785 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qx9jc/crc-debug-zbfct" Nov 25 10:03:41 crc kubenswrapper[4565]: I1125 10:03:41.514670 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/78b76099-ac43-4b80-a819-09a5ab2a1b40-host\") pod \"crc-debug-zbfct\" (UID: \"78b76099-ac43-4b80-a819-09a5ab2a1b40\") " pod="openshift-must-gather-qx9jc/crc-debug-zbfct" Nov 25 10:03:41 crc kubenswrapper[4565]: I1125 10:03:41.514729 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6grps\" (UniqueName: \"kubernetes.io/projected/78b76099-ac43-4b80-a819-09a5ab2a1b40-kube-api-access-6grps\") pod \"crc-debug-zbfct\" (UID: \"78b76099-ac43-4b80-a819-09a5ab2a1b40\") " pod="openshift-must-gather-qx9jc/crc-debug-zbfct" Nov 25 10:03:41 crc kubenswrapper[4565]: I1125 10:03:41.617116 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/78b76099-ac43-4b80-a819-09a5ab2a1b40-host\") pod \"crc-debug-zbfct\" (UID: \"78b76099-ac43-4b80-a819-09a5ab2a1b40\") " pod="openshift-must-gather-qx9jc/crc-debug-zbfct" Nov 25 10:03:41 crc kubenswrapper[4565]: I1125 10:03:41.617178 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6grps\" (UniqueName: \"kubernetes.io/projected/78b76099-ac43-4b80-a819-09a5ab2a1b40-kube-api-access-6grps\") pod \"crc-debug-zbfct\" (UID: \"78b76099-ac43-4b80-a819-09a5ab2a1b40\") " pod="openshift-must-gather-qx9jc/crc-debug-zbfct" Nov 25 10:03:41 crc kubenswrapper[4565]: I1125 10:03:41.617281 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/78b76099-ac43-4b80-a819-09a5ab2a1b40-host\") pod \"crc-debug-zbfct\" (UID: \"78b76099-ac43-4b80-a819-09a5ab2a1b40\") " pod="openshift-must-gather-qx9jc/crc-debug-zbfct" Nov 25 10:03:41 crc kubenswrapper[4565]: I1125 10:03:41.633631 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6grps\" (UniqueName: \"kubernetes.io/projected/78b76099-ac43-4b80-a819-09a5ab2a1b40-kube-api-access-6grps\") pod \"crc-debug-zbfct\" (UID: \"78b76099-ac43-4b80-a819-09a5ab2a1b40\") " pod="openshift-must-gather-qx9jc/crc-debug-zbfct" Nov 25 10:03:41 crc kubenswrapper[4565]: I1125 10:03:41.766966 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qx9jc/crc-debug-zbfct" Nov 25 10:03:41 crc kubenswrapper[4565]: I1125 10:03:41.957576 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qx9jc/crc-debug-zbfct" event={"ID":"78b76099-ac43-4b80-a819-09a5ab2a1b40","Type":"ContainerStarted","Data":"15082cb49c1ed68e5bfa8d32444da28f7d620b1e9510126ee29188136a24fe9b"} Nov 25 10:03:42 crc kubenswrapper[4565]: I1125 10:03:42.978119 4565 generic.go:334] "Generic (PLEG): container finished" podID="78b76099-ac43-4b80-a819-09a5ab2a1b40" containerID="bd61671f03c90477320bf60147a48be339f3a8759ac6e734e6317db663b2fd7f" exitCode=0 Nov 25 10:03:42 crc kubenswrapper[4565]: I1125 10:03:42.978220 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qx9jc/crc-debug-zbfct" event={"ID":"78b76099-ac43-4b80-a819-09a5ab2a1b40","Type":"ContainerDied","Data":"bd61671f03c90477320bf60147a48be339f3a8759ac6e734e6317db663b2fd7f"} Nov 25 10:03:43 crc kubenswrapper[4565]: I1125 10:03:43.324951 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qx9jc/crc-debug-zbfct"] Nov 25 10:03:43 crc kubenswrapper[4565]: I1125 10:03:43.332850 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qx9jc/crc-debug-zbfct"] Nov 25 10:03:44 crc kubenswrapper[4565]: I1125 10:03:44.077832 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qx9jc/crc-debug-zbfct" Nov 25 10:03:44 crc kubenswrapper[4565]: I1125 10:03:44.274324 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6grps\" (UniqueName: \"kubernetes.io/projected/78b76099-ac43-4b80-a819-09a5ab2a1b40-kube-api-access-6grps\") pod \"78b76099-ac43-4b80-a819-09a5ab2a1b40\" (UID: \"78b76099-ac43-4b80-a819-09a5ab2a1b40\") " Nov 25 10:03:44 crc kubenswrapper[4565]: I1125 10:03:44.274552 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/78b76099-ac43-4b80-a819-09a5ab2a1b40-host\") pod \"78b76099-ac43-4b80-a819-09a5ab2a1b40\" (UID: \"78b76099-ac43-4b80-a819-09a5ab2a1b40\") " Nov 25 10:03:44 crc kubenswrapper[4565]: I1125 10:03:44.274626 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78b76099-ac43-4b80-a819-09a5ab2a1b40-host" (OuterVolumeSpecName: "host") pod "78b76099-ac43-4b80-a819-09a5ab2a1b40" (UID: "78b76099-ac43-4b80-a819-09a5ab2a1b40"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 10:03:44 crc kubenswrapper[4565]: I1125 10:03:44.276183 4565 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/78b76099-ac43-4b80-a819-09a5ab2a1b40-host\") on node \"crc\" DevicePath \"\"" Nov 25 10:03:44 crc kubenswrapper[4565]: I1125 10:03:44.288125 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78b76099-ac43-4b80-a819-09a5ab2a1b40-kube-api-access-6grps" (OuterVolumeSpecName: "kube-api-access-6grps") pod "78b76099-ac43-4b80-a819-09a5ab2a1b40" (UID: "78b76099-ac43-4b80-a819-09a5ab2a1b40"). InnerVolumeSpecName "kube-api-access-6grps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 10:03:44 crc kubenswrapper[4565]: I1125 10:03:44.378904 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6grps\" (UniqueName: \"kubernetes.io/projected/78b76099-ac43-4b80-a819-09a5ab2a1b40-kube-api-access-6grps\") on node \"crc\" DevicePath \"\"" Nov 25 10:03:44 crc kubenswrapper[4565]: I1125 10:03:44.508989 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qx9jc/crc-debug-zvhmg"] Nov 25 10:03:44 crc kubenswrapper[4565]: E1125 10:03:44.509379 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b76099-ac43-4b80-a819-09a5ab2a1b40" containerName="container-00" Nov 25 10:03:44 crc kubenswrapper[4565]: I1125 10:03:44.509396 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b76099-ac43-4b80-a819-09a5ab2a1b40" containerName="container-00" Nov 25 10:03:44 crc kubenswrapper[4565]: I1125 10:03:44.509610 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b76099-ac43-4b80-a819-09a5ab2a1b40" containerName="container-00" Nov 25 10:03:44 crc kubenswrapper[4565]: I1125 10:03:44.510290 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qx9jc/crc-debug-zvhmg" Nov 25 10:03:44 crc kubenswrapper[4565]: I1125 10:03:44.685777 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b96701eb-417f-42c4-b805-06e71d3aec78-host\") pod \"crc-debug-zvhmg\" (UID: \"b96701eb-417f-42c4-b805-06e71d3aec78\") " pod="openshift-must-gather-qx9jc/crc-debug-zvhmg" Nov 25 10:03:44 crc kubenswrapper[4565]: I1125 10:03:44.685865 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl9ng\" (UniqueName: \"kubernetes.io/projected/b96701eb-417f-42c4-b805-06e71d3aec78-kube-api-access-dl9ng\") pod \"crc-debug-zvhmg\" (UID: \"b96701eb-417f-42c4-b805-06e71d3aec78\") " pod="openshift-must-gather-qx9jc/crc-debug-zvhmg" Nov 25 10:03:44 crc kubenswrapper[4565]: I1125 10:03:44.788150 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl9ng\" (UniqueName: \"kubernetes.io/projected/b96701eb-417f-42c4-b805-06e71d3aec78-kube-api-access-dl9ng\") pod \"crc-debug-zvhmg\" (UID: \"b96701eb-417f-42c4-b805-06e71d3aec78\") " pod="openshift-must-gather-qx9jc/crc-debug-zvhmg" Nov 25 10:03:44 crc kubenswrapper[4565]: I1125 10:03:44.788321 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b96701eb-417f-42c4-b805-06e71d3aec78-host\") pod \"crc-debug-zvhmg\" (UID: \"b96701eb-417f-42c4-b805-06e71d3aec78\") " pod="openshift-must-gather-qx9jc/crc-debug-zvhmg" Nov 25 10:03:44 crc kubenswrapper[4565]: I1125 10:03:44.788470 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b96701eb-417f-42c4-b805-06e71d3aec78-host\") pod \"crc-debug-zvhmg\" (UID: \"b96701eb-417f-42c4-b805-06e71d3aec78\") " pod="openshift-must-gather-qx9jc/crc-debug-zvhmg" Nov 25 10:03:44 crc 
kubenswrapper[4565]: I1125 10:03:44.836360 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl9ng\" (UniqueName: \"kubernetes.io/projected/b96701eb-417f-42c4-b805-06e71d3aec78-kube-api-access-dl9ng\") pod \"crc-debug-zvhmg\" (UID: \"b96701eb-417f-42c4-b805-06e71d3aec78\") " pod="openshift-must-gather-qx9jc/crc-debug-zvhmg" Nov 25 10:03:44 crc kubenswrapper[4565]: I1125 10:03:44.998630 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15082cb49c1ed68e5bfa8d32444da28f7d620b1e9510126ee29188136a24fe9b" Nov 25 10:03:44 crc kubenswrapper[4565]: I1125 10:03:44.998676 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qx9jc/crc-debug-zbfct" Nov 25 10:03:45 crc kubenswrapper[4565]: I1125 10:03:45.108483 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78b76099-ac43-4b80-a819-09a5ab2a1b40" path="/var/lib/kubelet/pods/78b76099-ac43-4b80-a819-09a5ab2a1b40/volumes" Nov 25 10:03:45 crc kubenswrapper[4565]: I1125 10:03:45.123952 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qx9jc/crc-debug-zvhmg" Nov 25 10:03:46 crc kubenswrapper[4565]: I1125 10:03:46.009842 4565 generic.go:334] "Generic (PLEG): container finished" podID="b96701eb-417f-42c4-b805-06e71d3aec78" containerID="50a7b4b089e44bf6ec89259af7ab3a7712cb96db22b02f76b4d14c7d36392a89" exitCode=0 Nov 25 10:03:46 crc kubenswrapper[4565]: I1125 10:03:46.009970 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qx9jc/crc-debug-zvhmg" event={"ID":"b96701eb-417f-42c4-b805-06e71d3aec78","Type":"ContainerDied","Data":"50a7b4b089e44bf6ec89259af7ab3a7712cb96db22b02f76b4d14c7d36392a89"} Nov 25 10:03:46 crc kubenswrapper[4565]: I1125 10:03:46.010271 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qx9jc/crc-debug-zvhmg" event={"ID":"b96701eb-417f-42c4-b805-06e71d3aec78","Type":"ContainerStarted","Data":"1ef773695c006c1552dfd3f32243e2db01d293b151d59ff2114b1d861c95f838"} Nov 25 10:03:46 crc kubenswrapper[4565]: I1125 10:03:46.048845 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qx9jc/crc-debug-zvhmg"] Nov 25 10:03:46 crc kubenswrapper[4565]: I1125 10:03:46.064050 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qx9jc/crc-debug-zvhmg"] Nov 25 10:03:47 crc kubenswrapper[4565]: I1125 10:03:47.110903 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qx9jc/crc-debug-zvhmg" Nov 25 10:03:47 crc kubenswrapper[4565]: I1125 10:03:47.246356 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl9ng\" (UniqueName: \"kubernetes.io/projected/b96701eb-417f-42c4-b805-06e71d3aec78-kube-api-access-dl9ng\") pod \"b96701eb-417f-42c4-b805-06e71d3aec78\" (UID: \"b96701eb-417f-42c4-b805-06e71d3aec78\") " Nov 25 10:03:47 crc kubenswrapper[4565]: I1125 10:03:47.246893 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b96701eb-417f-42c4-b805-06e71d3aec78-host\") pod \"b96701eb-417f-42c4-b805-06e71d3aec78\" (UID: \"b96701eb-417f-42c4-b805-06e71d3aec78\") " Nov 25 10:03:47 crc kubenswrapper[4565]: I1125 10:03:47.248563 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b96701eb-417f-42c4-b805-06e71d3aec78-host" (OuterVolumeSpecName: "host") pod "b96701eb-417f-42c4-b805-06e71d3aec78" (UID: "b96701eb-417f-42c4-b805-06e71d3aec78"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 10:03:47 crc kubenswrapper[4565]: I1125 10:03:47.265167 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b96701eb-417f-42c4-b805-06e71d3aec78-kube-api-access-dl9ng" (OuterVolumeSpecName: "kube-api-access-dl9ng") pod "b96701eb-417f-42c4-b805-06e71d3aec78" (UID: "b96701eb-417f-42c4-b805-06e71d3aec78"). InnerVolumeSpecName "kube-api-access-dl9ng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 10:03:47 crc kubenswrapper[4565]: I1125 10:03:47.351906 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl9ng\" (UniqueName: \"kubernetes.io/projected/b96701eb-417f-42c4-b805-06e71d3aec78-kube-api-access-dl9ng\") on node \"crc\" DevicePath \"\"" Nov 25 10:03:47 crc kubenswrapper[4565]: I1125 10:03:47.351955 4565 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b96701eb-417f-42c4-b805-06e71d3aec78-host\") on node \"crc\" DevicePath \"\"" Nov 25 10:03:48 crc kubenswrapper[4565]: I1125 10:03:48.026481 4565 scope.go:117] "RemoveContainer" containerID="50a7b4b089e44bf6ec89259af7ab3a7712cb96db22b02f76b4d14c7d36392a89" Nov 25 10:03:48 crc kubenswrapper[4565]: I1125 10:03:48.026614 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qx9jc/crc-debug-zvhmg" Nov 25 10:03:49 crc kubenswrapper[4565]: I1125 10:03:49.105273 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b96701eb-417f-42c4-b805-06e71d3aec78" path="/var/lib/kubelet/pods/b96701eb-417f-42c4-b805-06e71d3aec78/volumes" Nov 25 10:03:55 crc kubenswrapper[4565]: I1125 10:03:55.099090 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 10:03:55 crc kubenswrapper[4565]: I1125 10:03:55.099676 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 10:03:55 crc kubenswrapper[4565]: 
I1125 10:03:55.104950 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 10:03:55 crc kubenswrapper[4565]: I1125 10:03:55.105358 4565 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c"} pod="openshift-machine-config-operator/machine-config-daemon-r28bt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 10:03:55 crc kubenswrapper[4565]: I1125 10:03:55.105422 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" containerID="cri-o://49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" gracePeriod=600 Nov 25 10:03:55 crc kubenswrapper[4565]: E1125 10:03:55.235716 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:03:56 crc kubenswrapper[4565]: I1125 10:03:56.104197 4565 generic.go:334] "Generic (PLEG): container finished" podID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" exitCode=0 Nov 25 10:03:56 crc kubenswrapper[4565]: I1125 10:03:56.104644 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" 
event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerDied","Data":"49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c"} Nov 25 10:03:56 crc kubenswrapper[4565]: I1125 10:03:56.104691 4565 scope.go:117] "RemoveContainer" containerID="75b70cc61715acf4a6e67100e637f90a320593b52905b15de68119be8561ea55" Nov 25 10:03:56 crc kubenswrapper[4565]: I1125 10:03:56.105194 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:03:56 crc kubenswrapper[4565]: E1125 10:03:56.105663 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:04:09 crc kubenswrapper[4565]: I1125 10:04:09.097223 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:04:09 crc kubenswrapper[4565]: E1125 10:04:09.098026 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:04:20 crc kubenswrapper[4565]: I1125 10:04:20.097351 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:04:20 crc kubenswrapper[4565]: E1125 10:04:20.098245 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:04:23 crc kubenswrapper[4565]: I1125 10:04:23.131210 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6b4c469f64-4jfxz_b577976e-309e-47cd-80a6-4f72547d912b/barbican-api/0.log" Nov 25 10:04:23 crc kubenswrapper[4565]: I1125 10:04:23.367490 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6b4c469f64-4jfxz_b577976e-309e-47cd-80a6-4f72547d912b/barbican-api-log/0.log" Nov 25 10:04:23 crc kubenswrapper[4565]: I1125 10:04:23.460716 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58b56d8886-ghgt5_b3377fa1-4b60-4b8c-9efb-a266d872af91/barbican-keystone-listener/0.log" Nov 25 10:04:23 crc kubenswrapper[4565]: I1125 10:04:23.542668 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58b56d8886-ghgt5_b3377fa1-4b60-4b8c-9efb-a266d872af91/barbican-keystone-listener-log/0.log" Nov 25 10:04:23 crc kubenswrapper[4565]: I1125 10:04:23.606115 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-66898497b9-mtk55_5ff2dc5b-9c32-43bc-b8c7-7341812d4160/barbican-worker/0.log" Nov 25 10:04:23 crc kubenswrapper[4565]: I1125 10:04:23.713207 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-66898497b9-mtk55_5ff2dc5b-9c32-43bc-b8c7-7341812d4160/barbican-worker-log/0.log" Nov 25 10:04:23 crc kubenswrapper[4565]: I1125 10:04:23.906983 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f_3de99810-1335-466f-8386-e4ecbe49f3fd/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:04:24 crc kubenswrapper[4565]: I1125 10:04:24.037673 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_57d9524a-c577-4af2-a063-06c4928a3505/ceilometer-central-agent/0.log" Nov 25 10:04:24 crc kubenswrapper[4565]: I1125 10:04:24.090218 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_57d9524a-c577-4af2-a063-06c4928a3505/ceilometer-notification-agent/0.log" Nov 25 10:04:24 crc kubenswrapper[4565]: I1125 10:04:24.175222 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_57d9524a-c577-4af2-a063-06c4928a3505/sg-core/0.log" Nov 25 10:04:24 crc kubenswrapper[4565]: I1125 10:04:24.181890 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_57d9524a-c577-4af2-a063-06c4928a3505/proxy-httpd/0.log" Nov 25 10:04:24 crc kubenswrapper[4565]: I1125 10:04:24.309914 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx_d22f3d2a-b6ff-4188-9778-e7108dd44f3a/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:04:24 crc kubenswrapper[4565]: I1125 10:04:24.433887 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v_6fcb5ee3-1258-4245-bd22-5aecd14a312c/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:04:24 crc kubenswrapper[4565]: I1125 10:04:24.707756 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dd508fe3-6c52-4b41-a308-7f8697523a81/cinder-api-log/0.log" Nov 25 10:04:24 crc kubenswrapper[4565]: I1125 10:04:24.755842 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_dd508fe3-6c52-4b41-a308-7f8697523a81/cinder-api/0.log" Nov 25 10:04:24 crc kubenswrapper[4565]: I1125 10:04:24.963010 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_d40080b6-5cb6-48e6-9625-9c8b821ed10b/cinder-backup/0.log" Nov 25 10:04:25 crc kubenswrapper[4565]: I1125 10:04:25.011086 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_d40080b6-5cb6-48e6-9625-9c8b821ed10b/probe/0.log" Nov 25 10:04:25 crc kubenswrapper[4565]: I1125 10:04:25.080016 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_192bb64d-39b3-4dad-a57b-65afe8c7ec7e/cinder-scheduler/0.log" Nov 25 10:04:25 crc kubenswrapper[4565]: I1125 10:04:25.256277 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_192bb64d-39b3-4dad-a57b-65afe8c7ec7e/probe/0.log" Nov 25 10:04:25 crc kubenswrapper[4565]: I1125 10:04:25.343143 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_d3b9c251-0777-4463-916e-b6712e7a69b7/probe/0.log" Nov 25 10:04:25 crc kubenswrapper[4565]: I1125 10:04:25.386130 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_d3b9c251-0777-4463-916e-b6712e7a69b7/cinder-volume/0.log" Nov 25 10:04:25 crc kubenswrapper[4565]: I1125 10:04:25.640687 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-lggx8_13f5faf5-45eb-46fc-b76b-59b8babba10c/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:04:25 crc kubenswrapper[4565]: I1125 10:04:25.725442 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-jq72p_49dcd4ab-323d-4499-97a6-69fe4e29a0a6/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:04:25 crc kubenswrapper[4565]: I1125 
10:04:25.934412 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-f7c9fd6c5-k87xr_15b78aa8-e609-4509-bc5e-4ec6fa67dd57/init/0.log" Nov 25 10:04:26 crc kubenswrapper[4565]: I1125 10:04:26.137792 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-f7c9fd6c5-k87xr_15b78aa8-e609-4509-bc5e-4ec6fa67dd57/init/0.log" Nov 25 10:04:26 crc kubenswrapper[4565]: I1125 10:04:26.276176 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e1ee609b-48dd-4e92-9b13-e9bf94768ead/glance-httpd/0.log" Nov 25 10:04:26 crc kubenswrapper[4565]: I1125 10:04:26.541327 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e1ee609b-48dd-4e92-9b13-e9bf94768ead/glance-log/0.log" Nov 25 10:04:26 crc kubenswrapper[4565]: I1125 10:04:26.634553 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_451d4f5d-1ecc-4633-a889-ea95473bc981/glance-httpd/0.log" Nov 25 10:04:26 crc kubenswrapper[4565]: I1125 10:04:26.846324 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_451d4f5d-1ecc-4633-a889-ea95473bc981/glance-log/0.log" Nov 25 10:04:27 crc kubenswrapper[4565]: I1125 10:04:27.020747 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6549bb6ccb-qd7ll_d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb/horizon/0.log" Nov 25 10:04:27 crc kubenswrapper[4565]: I1125 10:04:27.299082 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6549bb6ccb-qd7ll_d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb/horizon-log/0.log" Nov 25 10:04:27 crc kubenswrapper[4565]: I1125 10:04:27.377390 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-txnr9_e277484b-b7d0-4a20-9551-a4d62a9720ea/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 
10:04:27 crc kubenswrapper[4565]: I1125 10:04:27.586231 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-f7c9fd6c5-k87xr_15b78aa8-e609-4509-bc5e-4ec6fa67dd57/dnsmasq-dns/0.log" Nov 25 10:04:27 crc kubenswrapper[4565]: I1125 10:04:27.660382 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-h9t4j_766382aa-dcdb-41f0-afb0-cb90d5ea8f31/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:04:27 crc kubenswrapper[4565]: I1125 10:04:27.766694 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-757f65c548-72sgl_187eb1ab-d8ed-4506-bc95-59cb1f61e285/keystone-api/0.log" Nov 25 10:04:28 crc kubenswrapper[4565]: I1125 10:04:28.040501 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29401081-lh62g_3749c5c9-d117-42da-bd30-bb593c5d1fb2/keystone-cron/0.log" Nov 25 10:04:28 crc kubenswrapper[4565]: I1125 10:04:28.172566 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_5275621d-5c51-4586-85f2-e0e24cb32266/kube-state-metrics/3.log" Nov 25 10:04:28 crc kubenswrapper[4565]: I1125 10:04:28.246580 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_5275621d-5c51-4586-85f2-e0e24cb32266/kube-state-metrics/2.log" Nov 25 10:04:28 crc kubenswrapper[4565]: I1125 10:04:28.345523 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-xqn99_e1061e52-8553-4932-9689-83016e2b413f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:04:28 crc kubenswrapper[4565]: I1125 10:04:28.486027 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_dbdf37e4-2636-44d8-981a-4c960e37799e/manila-api/0.log" Nov 25 10:04:28 crc kubenswrapper[4565]: I1125 10:04:28.554411 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-api-0_dbdf37e4-2636-44d8-981a-4c960e37799e/manila-api-log/0.log" Nov 25 10:04:28 crc kubenswrapper[4565]: I1125 10:04:28.658062 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_8fef203b-c8bb-4fb3-9415-b042e6837bff/manila-scheduler/0.log" Nov 25 10:04:28 crc kubenswrapper[4565]: I1125 10:04:28.704725 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_8fef203b-c8bb-4fb3-9415-b042e6837bff/probe/0.log" Nov 25 10:04:28 crc kubenswrapper[4565]: I1125 10:04:28.829879 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_60d980f4-b1c6-4991-ae13-143dcb6bf453/probe/0.log" Nov 25 10:04:28 crc kubenswrapper[4565]: I1125 10:04:28.860887 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_60d980f4-b1c6-4991-ae13-143dcb6bf453/manila-share/0.log" Nov 25 10:04:29 crc kubenswrapper[4565]: I1125 10:04:29.231651 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d4d697775-b8wbb_aca1b06e-7a63-4a42-873b-427de57cef7f/neutron-httpd/0.log" Nov 25 10:04:29 crc kubenswrapper[4565]: I1125 10:04:29.281485 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw_f5a1f544-cbde-40e4-aec7-72347718b75d/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:04:29 crc kubenswrapper[4565]: I1125 10:04:29.286734 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d4d697775-b8wbb_aca1b06e-7a63-4a42-873b-427de57cef7f/neutron-api/0.log" Nov 25 10:04:29 crc kubenswrapper[4565]: I1125 10:04:29.877440 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_13ff22ce-a715-4dba-aaa3-b6ba3a929d55/nova-api-log/0.log" Nov 25 10:04:30 crc kubenswrapper[4565]: I1125 10:04:30.000894 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_ba666b44-183c-4752-8f43-899a921da911/nova-cell0-conductor-conductor/0.log" Nov 25 10:04:30 crc kubenswrapper[4565]: I1125 10:04:30.215694 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_26443d78-c2f9-4e62-9f77-69dbca9848f0/nova-cell1-conductor-conductor/0.log" Nov 25 10:04:30 crc kubenswrapper[4565]: I1125 10:04:30.282370 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_13ff22ce-a715-4dba-aaa3-b6ba3a929d55/nova-api-api/0.log" Nov 25 10:04:30 crc kubenswrapper[4565]: I1125 10:04:30.399359 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_42ef1e95-a44e-4dea-8127-228bd8065e0c/nova-cell1-novncproxy-novncproxy/0.log" Nov 25 10:04:30 crc kubenswrapper[4565]: I1125 10:04:30.542844 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm_769230ff-fe55-4c62-bc60-73797b5fc1bb/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:04:30 crc kubenswrapper[4565]: I1125 10:04:30.726909 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_1e090c67-47d1-445e-843f-4cf950699016/nova-metadata-log/0.log" Nov 25 10:04:31 crc kubenswrapper[4565]: I1125 10:04:31.097751 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:04:31 crc kubenswrapper[4565]: E1125 10:04:31.098080 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 
10:04:31 crc kubenswrapper[4565]: I1125 10:04:31.190257 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_af73acca-d67e-47fe-89ff-70f865731045/mysql-bootstrap/0.log" Nov 25 10:04:31 crc kubenswrapper[4565]: I1125 10:04:31.197807 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c528e51d-34ba-402c-8985-b53beb776f43/nova-scheduler-scheduler/0.log" Nov 25 10:04:31 crc kubenswrapper[4565]: I1125 10:04:31.374360 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_af73acca-d67e-47fe-89ff-70f865731045/mysql-bootstrap/0.log" Nov 25 10:04:31 crc kubenswrapper[4565]: I1125 10:04:31.474635 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_af73acca-d67e-47fe-89ff-70f865731045/galera/0.log" Nov 25 10:04:31 crc kubenswrapper[4565]: I1125 10:04:31.647882 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_80ebb406-0240-4ba6-86f1-177776f19865/mysql-bootstrap/0.log" Nov 25 10:04:31 crc kubenswrapper[4565]: I1125 10:04:31.917947 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_80ebb406-0240-4ba6-86f1-177776f19865/mysql-bootstrap/0.log" Nov 25 10:04:32 crc kubenswrapper[4565]: I1125 10:04:32.058830 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_80ebb406-0240-4ba6-86f1-177776f19865/galera/0.log" Nov 25 10:04:32 crc kubenswrapper[4565]: I1125 10:04:32.235201 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_7dfd09a5-8627-4394-ac4f-367458ffe0b2/openstackclient/0.log" Nov 25 10:04:32 crc kubenswrapper[4565]: I1125 10:04:32.272328 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_1e090c67-47d1-445e-843f-4cf950699016/nova-metadata-metadata/0.log" Nov 25 10:04:32 crc kubenswrapper[4565]: I1125 
10:04:32.469969 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ct2r6_1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9/openstack-network-exporter/0.log" Nov 25 10:04:32 crc kubenswrapper[4565]: I1125 10:04:32.527251 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-7k468_67e2fa61-acc9-415b-9e10-0a35b6a3feb7/ovn-controller/0.log" Nov 25 10:04:32 crc kubenswrapper[4565]: I1125 10:04:32.782021 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qhxwx_702c3b01-501a-42d1-a945-603af0fbd306/ovsdb-server-init/0.log" Nov 25 10:04:33 crc kubenswrapper[4565]: I1125 10:04:33.008660 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qhxwx_702c3b01-501a-42d1-a945-603af0fbd306/ovsdb-server-init/0.log" Nov 25 10:04:33 crc kubenswrapper[4565]: I1125 10:04:33.070129 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qhxwx_702c3b01-501a-42d1-a945-603af0fbd306/ovsdb-server/0.log" Nov 25 10:04:33 crc kubenswrapper[4565]: I1125 10:04:33.136315 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qhxwx_702c3b01-501a-42d1-a945-603af0fbd306/ovs-vswitchd/0.log" Nov 25 10:04:33 crc kubenswrapper[4565]: I1125 10:04:33.329538 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-m2rrn_eedd2b64-c2c0-43dd-a5d9-ee7508387909/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:04:33 crc kubenswrapper[4565]: I1125 10:04:33.347127 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_abfe2157-f884-4325-8d80-7fa9b90754a9/openstack-network-exporter/0.log" Nov 25 10:04:33 crc kubenswrapper[4565]: I1125 10:04:33.439130 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_abfe2157-f884-4325-8d80-7fa9b90754a9/ovn-northd/0.log" Nov 25 
10:04:33 crc kubenswrapper[4565]: I1125 10:04:33.694089 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_68861e53-c198-4971-baf5-dd1653ef84ad/openstack-network-exporter/0.log" Nov 25 10:04:33 crc kubenswrapper[4565]: I1125 10:04:33.713252 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_68861e53-c198-4971-baf5-dd1653ef84ad/ovsdbserver-nb/0.log" Nov 25 10:04:33 crc kubenswrapper[4565]: I1125 10:04:33.982026 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_164519a3-6eaf-49ac-bc20-cd1a4b04d594/ovsdbserver-sb/0.log" Nov 25 10:04:34 crc kubenswrapper[4565]: I1125 10:04:34.014581 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_164519a3-6eaf-49ac-bc20-cd1a4b04d594/openstack-network-exporter/0.log" Nov 25 10:04:34 crc kubenswrapper[4565]: I1125 10:04:34.198284 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-66b84f9bd8-pdssf_7d87b83a-02e7-48d5-8f87-e127fe8ffe0b/placement-api/0.log" Nov 25 10:04:34 crc kubenswrapper[4565]: I1125 10:04:34.376124 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-66b84f9bd8-pdssf_7d87b83a-02e7-48d5-8f87-e127fe8ffe0b/placement-log/0.log" Nov 25 10:04:34 crc kubenswrapper[4565]: I1125 10:04:34.467770 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9914fdc4-3539-4d6b-97cf-e4c5330acfc0/setup-container/0.log" Nov 25 10:04:34 crc kubenswrapper[4565]: I1125 10:04:34.796870 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9914fdc4-3539-4d6b-97cf-e4c5330acfc0/setup-container/0.log" Nov 25 10:04:34 crc kubenswrapper[4565]: I1125 10:04:34.819780 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_49a91aac-079e-475b-ac75-f400d2081405/setup-container/0.log" Nov 25 10:04:34 crc 
kubenswrapper[4565]: I1125 10:04:34.918095 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9914fdc4-3539-4d6b-97cf-e4c5330acfc0/rabbitmq/0.log" Nov 25 10:04:35 crc kubenswrapper[4565]: I1125 10:04:35.217086 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_49a91aac-079e-475b-ac75-f400d2081405/setup-container/0.log" Nov 25 10:04:35 crc kubenswrapper[4565]: I1125 10:04:35.224806 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_49a91aac-079e-475b-ac75-f400d2081405/rabbitmq/0.log" Nov 25 10:04:35 crc kubenswrapper[4565]: I1125 10:04:35.361624 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7_dccca567-2d50-4077-8a64-803dafa14ffb/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:04:35 crc kubenswrapper[4565]: I1125 10:04:35.450706 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw_e29506cc-593c-49fd-b8eb-0ec1c0c8be5b/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:04:35 crc kubenswrapper[4565]: I1125 10:04:35.671724 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-kl4bm_41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:04:35 crc kubenswrapper[4565]: I1125 10:04:35.852624 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-x54cf_ff542e2a-3788-42a5-8a29-66f22838511d/ssh-known-hosts-edpm-deployment/0.log" Nov 25 10:04:35 crc kubenswrapper[4565]: I1125 10:04:35.964769 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_a3071a8a-a30b-4b2b-aea0-5882f4eff1b2/tempest-tests-tempest-tests-runner/0.log" Nov 25 10:04:36 crc kubenswrapper[4565]: I1125 
10:04:36.143602 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_1a5fdee8-1424-4227-a297-3c68d5463280/test-operator-logs-container/0.log" Nov 25 10:04:36 crc kubenswrapper[4565]: I1125 10:04:36.244499 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9_7b9ebd21-0421-42f3-a7e6-8f0038b8c07e/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:04:42 crc kubenswrapper[4565]: I1125 10:04:42.096760 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:04:42 crc kubenswrapper[4565]: E1125 10:04:42.104244 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:04:48 crc kubenswrapper[4565]: I1125 10:04:48.771352 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_26db83c6-ee58-44da-bcb6-16989b77fba4/memcached/0.log" Nov 25 10:04:55 crc kubenswrapper[4565]: I1125 10:04:55.103667 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:04:55 crc kubenswrapper[4565]: E1125 10:04:55.104509 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:05:04 crc kubenswrapper[4565]: I1125 10:05:04.295502 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-86dc4d89c8-cxwrc_873884b1-6ee8-400c-9ca2-0b0b3c4618e9/kube-rbac-proxy/0.log" Nov 25 10:05:04 crc kubenswrapper[4565]: I1125 10:05:04.302768 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-86dc4d89c8-cxwrc_873884b1-6ee8-400c-9ca2-0b0b3c4618e9/manager/3.log" Nov 25 10:05:04 crc kubenswrapper[4565]: I1125 10:05:04.462213 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-86dc4d89c8-cxwrc_873884b1-6ee8-400c-9ca2-0b0b3c4618e9/manager/2.log" Nov 25 10:05:04 crc kubenswrapper[4565]: I1125 10:05:04.609457 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj_62937ebb-ead0-4d96-b186-9dfcc8967ec0/util/0.log" Nov 25 10:05:04 crc kubenswrapper[4565]: I1125 10:05:04.672648 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj_62937ebb-ead0-4d96-b186-9dfcc8967ec0/pull/0.log" Nov 25 10:05:04 crc kubenswrapper[4565]: I1125 10:05:04.748636 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj_62937ebb-ead0-4d96-b186-9dfcc8967ec0/util/0.log" Nov 25 10:05:04 crc kubenswrapper[4565]: I1125 10:05:04.754123 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj_62937ebb-ead0-4d96-b186-9dfcc8967ec0/pull/0.log" Nov 25 10:05:05 crc kubenswrapper[4565]: I1125 10:05:05.003880 4565 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj_62937ebb-ead0-4d96-b186-9dfcc8967ec0/util/0.log" Nov 25 10:05:05 crc kubenswrapper[4565]: I1125 10:05:05.020326 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj_62937ebb-ead0-4d96-b186-9dfcc8967ec0/pull/0.log" Nov 25 10:05:05 crc kubenswrapper[4565]: I1125 10:05:05.030194 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj_62937ebb-ead0-4d96-b186-9dfcc8967ec0/extract/0.log" Nov 25 10:05:05 crc kubenswrapper[4565]: I1125 10:05:05.266382 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79856dc55c-ddlth_1af57713-55c3-45ec-b98b-1aac75a2d60b/manager/3.log" Nov 25 10:05:05 crc kubenswrapper[4565]: I1125 10:05:05.284842 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79856dc55c-ddlth_1af57713-55c3-45ec-b98b-1aac75a2d60b/manager/2.log" Nov 25 10:05:05 crc kubenswrapper[4565]: I1125 10:05:05.324164 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79856dc55c-ddlth_1af57713-55c3-45ec-b98b-1aac75a2d60b/kube-rbac-proxy/0.log" Nov 25 10:05:05 crc kubenswrapper[4565]: I1125 10:05:05.461857 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d695c9b56-t68ww_a933a688-5393-4b7b-b0b7-6ee5791970b1/kube-rbac-proxy/0.log" Nov 25 10:05:05 crc kubenswrapper[4565]: I1125 10:05:05.507122 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d695c9b56-t68ww_a933a688-5393-4b7b-b0b7-6ee5791970b1/manager/3.log" Nov 25 10:05:05 
crc kubenswrapper[4565]: I1125 10:05:05.587408 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d695c9b56-t68ww_a933a688-5393-4b7b-b0b7-6ee5791970b1/manager/2.log" Nov 25 10:05:05 crc kubenswrapper[4565]: I1125 10:05:05.742815 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-68b95954c9-f9bbj_92be75e0-b60b-4f41-bde1-4f74a4d306e3/manager/3.log" Nov 25 10:05:05 crc kubenswrapper[4565]: I1125 10:05:05.778429 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-68b95954c9-f9bbj_92be75e0-b60b-4f41-bde1-4f74a4d306e3/kube-rbac-proxy/0.log" Nov 25 10:05:05 crc kubenswrapper[4565]: I1125 10:05:05.780360 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-68b95954c9-f9bbj_92be75e0-b60b-4f41-bde1-4f74a4d306e3/manager/2.log" Nov 25 10:05:05 crc kubenswrapper[4565]: I1125 10:05:05.913619 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-774b86978c-bd8d6_93da1f7e-c5e8-4c9c-b6af-feb85c526b47/kube-rbac-proxy/0.log" Nov 25 10:05:05 crc kubenswrapper[4565]: I1125 10:05:05.982833 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-774b86978c-bd8d6_93da1f7e-c5e8-4c9c-b6af-feb85c526b47/manager/2.log" Nov 25 10:05:06 crc kubenswrapper[4565]: I1125 10:05:06.010383 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-774b86978c-bd8d6_93da1f7e-c5e8-4c9c-b6af-feb85c526b47/manager/3.log" Nov 25 10:05:06 crc kubenswrapper[4565]: I1125 10:05:06.140293 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c9694994-2s9lf_354fe5db-35d0-4d94-989c-02a077f8bd20/kube-rbac-proxy/0.log" Nov 25 10:05:06 crc kubenswrapper[4565]: I1125 10:05:06.207921 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c9694994-2s9lf_354fe5db-35d0-4d94-989c-02a077f8bd20/manager/3.log" Nov 25 10:05:06 crc kubenswrapper[4565]: I1125 10:05:06.230418 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c9694994-2s9lf_354fe5db-35d0-4d94-989c-02a077f8bd20/manager/2.log" Nov 25 10:05:06 crc kubenswrapper[4565]: I1125 10:05:06.348264 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-d5cc86f4b-2q9rf_333ae034-2972-4915-a547-364c01510827/kube-rbac-proxy/0.log" Nov 25 10:05:06 crc kubenswrapper[4565]: I1125 10:05:06.434545 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-d5cc86f4b-2q9rf_333ae034-2972-4915-a547-364c01510827/manager/3.log" Nov 25 10:05:06 crc kubenswrapper[4565]: I1125 10:05:06.445155 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-d5cc86f4b-2q9rf_333ae034-2972-4915-a547-364c01510827/manager/2.log" Nov 25 10:05:06 crc kubenswrapper[4565]: I1125 10:05:06.585084 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bfcdc958c-mjsqx_6402fac4-067f-4410-a00c-0d438d502f3c/kube-rbac-proxy/0.log" Nov 25 10:05:06 crc kubenswrapper[4565]: I1125 10:05:06.764307 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bfcdc958c-mjsqx_6402fac4-067f-4410-a00c-0d438d502f3c/manager/2.log" Nov 25 10:05:06 crc kubenswrapper[4565]: I1125 10:05:06.775206 4565 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bfcdc958c-mjsqx_6402fac4-067f-4410-a00c-0d438d502f3c/manager/3.log" Nov 25 10:05:06 crc kubenswrapper[4565]: I1125 10:05:06.791644 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-748dc6576f-pcqxq_d5be161b-0f0c-485e-b1c7-50a9fff4b053/kube-rbac-proxy/0.log" Nov 25 10:05:06 crc kubenswrapper[4565]: I1125 10:05:06.913183 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-748dc6576f-pcqxq_d5be161b-0f0c-485e-b1c7-50a9fff4b053/manager/2.log" Nov 25 10:05:06 crc kubenswrapper[4565]: I1125 10:05:06.986237 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-748dc6576f-pcqxq_d5be161b-0f0c-485e-b1c7-50a9fff4b053/manager/3.log" Nov 25 10:05:07 crc kubenswrapper[4565]: I1125 10:05:07.050944 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58bb8d67cc-lz6zt_cf68120a-e894-4189-8035-91f8045618c0/kube-rbac-proxy/0.log" Nov 25 10:05:07 crc kubenswrapper[4565]: I1125 10:05:07.200906 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58bb8d67cc-lz6zt_cf68120a-e894-4189-8035-91f8045618c0/manager/2.log" Nov 25 10:05:07 crc kubenswrapper[4565]: I1125 10:05:07.204219 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58bb8d67cc-lz6zt_cf68120a-e894-4189-8035-91f8045618c0/manager/3.log" Nov 25 10:05:07 crc kubenswrapper[4565]: I1125 10:05:07.264263 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-cb6c4fdb7-2gkww_4ee66804-213d-4e52-b04b-6b00eec8de2d/kube-rbac-proxy/0.log" Nov 25 10:05:07 crc 
kubenswrapper[4565]: I1125 10:05:07.390966 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-cb6c4fdb7-2gkww_4ee66804-213d-4e52-b04b-6b00eec8de2d/manager/2.log" Nov 25 10:05:07 crc kubenswrapper[4565]: I1125 10:05:07.412915 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-cb6c4fdb7-2gkww_4ee66804-213d-4e52-b04b-6b00eec8de2d/manager/3.log" Nov 25 10:05:07 crc kubenswrapper[4565]: I1125 10:05:07.546466 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7c57c8bbc4-pzd74_d0ef0237-045a-4153-a377-07b2c9e6ceba/kube-rbac-proxy/0.log" Nov 25 10:05:07 crc kubenswrapper[4565]: I1125 10:05:07.634143 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7c57c8bbc4-pzd74_d0ef0237-045a-4153-a377-07b2c9e6ceba/manager/3.log" Nov 25 10:05:07 crc kubenswrapper[4565]: I1125 10:05:07.637102 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7c57c8bbc4-pzd74_d0ef0237-045a-4153-a377-07b2c9e6ceba/manager/2.log" Nov 25 10:05:07 crc kubenswrapper[4565]: I1125 10:05:07.767722 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-n9bdd_f2c67417-c283-4158-91ec-f49478a5378e/kube-rbac-proxy/0.log" Nov 25 10:05:07 crc kubenswrapper[4565]: I1125 10:05:07.810877 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-n9bdd_f2c67417-c283-4158-91ec-f49478a5378e/manager/3.log" Nov 25 10:05:07 crc kubenswrapper[4565]: I1125 10:05:07.846584 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-n9bdd_f2c67417-c283-4158-91ec-f49478a5378e/manager/2.log" Nov 25 10:05:07 crc kubenswrapper[4565]: I1125 10:05:07.935583 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-fd75fd47d-hrr6t_6279e5b8-cc23-4b43-9554-754a61174bcd/kube-rbac-proxy/0.log" Nov 25 10:05:08 crc kubenswrapper[4565]: I1125 10:05:08.016673 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-fd75fd47d-hrr6t_6279e5b8-cc23-4b43-9554-754a61174bcd/manager/3.log" Nov 25 10:05:08 crc kubenswrapper[4565]: I1125 10:05:08.059745 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-fd75fd47d-hrr6t_6279e5b8-cc23-4b43-9554-754a61174bcd/manager/2.log" Nov 25 10:05:08 crc kubenswrapper[4565]: I1125 10:05:08.166375 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-b58f89467-sw4l6_d4a03edc-1b0f-4f50-bab7-b2292c453f4d/kube-rbac-proxy/0.log" Nov 25 10:05:08 crc kubenswrapper[4565]: I1125 10:05:08.190442 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-b58f89467-sw4l6_d4a03edc-1b0f-4f50-bab7-b2292c453f4d/manager/1.log" Nov 25 10:05:08 crc kubenswrapper[4565]: I1125 10:05:08.258967 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-b58f89467-sw4l6_d4a03edc-1b0f-4f50-bab7-b2292c453f4d/manager/0.log" Nov 25 10:05:08 crc kubenswrapper[4565]: I1125 10:05:08.545955 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7cd5954d9-fkc7l_579400cf-d71f-47f4-a98e-b94ccbf4ff72/manager/1.log" Nov 25 10:05:08 crc kubenswrapper[4565]: I1125 
10:05:08.561033 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7b567956b5-s8c4s_0c32d371-4207-4e71-8031-a27b6562f9a2/operator/1.log" Nov 25 10:05:08 crc kubenswrapper[4565]: I1125 10:05:08.800297 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7cd5954d9-fkc7l_579400cf-d71f-47f4-a98e-b94ccbf4ff72/manager/2.log" Nov 25 10:05:08 crc kubenswrapper[4565]: I1125 10:05:08.827150 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7b567956b5-s8c4s_0c32d371-4207-4e71-8031-a27b6562f9a2/operator/0.log" Nov 25 10:05:08 crc kubenswrapper[4565]: I1125 10:05:08.887016 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-2hp6w_4b5856eb-4d4d-406d-bb20-cbc44a10e522/registry-server/0.log" Nov 25 10:05:08 crc kubenswrapper[4565]: I1125 10:05:08.977526 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-66cf5c67ff-zz6wf_052c7786-4d54-4af0-8598-91ff09cdf966/kube-rbac-proxy/0.log" Nov 25 10:05:09 crc kubenswrapper[4565]: I1125 10:05:09.021512 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-66cf5c67ff-zz6wf_052c7786-4d54-4af0-8598-91ff09cdf966/manager/3.log" Nov 25 10:05:09 crc kubenswrapper[4565]: I1125 10:05:09.024350 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-66cf5c67ff-zz6wf_052c7786-4d54-4af0-8598-91ff09cdf966/manager/2.log" Nov 25 10:05:09 crc kubenswrapper[4565]: I1125 10:05:09.122742 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5db546f9d9-kgn59_31dbf471-6fab-4ddd-a384-e4dd5335d5dc/kube-rbac-proxy/0.log" Nov 25 10:05:09 crc 
kubenswrapper[4565]: I1125 10:05:09.188464 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5db546f9d9-kgn59_31dbf471-6fab-4ddd-a384-e4dd5335d5dc/manager/3.log" Nov 25 10:05:09 crc kubenswrapper[4565]: I1125 10:05:09.275234 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5db546f9d9-kgn59_31dbf471-6fab-4ddd-a384-e4dd5335d5dc/manager/2.log" Nov 25 10:05:09 crc kubenswrapper[4565]: I1125 10:05:09.306030 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-s4llp_a65931e1-7a1f-4251-9c4f-996b407dfb03/operator/3.log" Nov 25 10:05:09 crc kubenswrapper[4565]: I1125 10:05:09.322660 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-s4llp_a65931e1-7a1f-4251-9c4f-996b407dfb03/operator/2.log" Nov 25 10:05:09 crc kubenswrapper[4565]: I1125 10:05:09.439746 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6fdc4fcf86-zl2jr_f35f4446-328e-40d3-96d6-2bc814fb8a96/kube-rbac-proxy/0.log" Nov 25 10:05:09 crc kubenswrapper[4565]: I1125 10:05:09.477339 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6fdc4fcf86-zl2jr_f35f4446-328e-40d3-96d6-2bc814fb8a96/manager/3.log" Nov 25 10:05:09 crc kubenswrapper[4565]: I1125 10:05:09.500124 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6fdc4fcf86-zl2jr_f35f4446-328e-40d3-96d6-2bc814fb8a96/manager/2.log" Nov 25 10:05:09 crc kubenswrapper[4565]: I1125 10:05:09.603598 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-567f98c9d-7dzx4_1ef630cb-2220-41f5-8a3d-66a2a78ce0ce/kube-rbac-proxy/0.log" Nov 25 10:05:09 crc kubenswrapper[4565]: I1125 10:05:09.635093 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-567f98c9d-7dzx4_1ef630cb-2220-41f5-8a3d-66a2a78ce0ce/manager/3.log" Nov 25 10:05:09 crc kubenswrapper[4565]: I1125 10:05:09.698756 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-567f98c9d-7dzx4_1ef630cb-2220-41f5-8a3d-66a2a78ce0ce/manager/2.log" Nov 25 10:05:09 crc kubenswrapper[4565]: I1125 10:05:09.762212 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cb74df96-sj4j7_cbdce822-eeeb-448b-9f3b-46fdf9e9b43d/kube-rbac-proxy/0.log" Nov 25 10:05:09 crc kubenswrapper[4565]: I1125 10:05:09.895092 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cb74df96-sj4j7_cbdce822-eeeb-448b-9f3b-46fdf9e9b43d/manager/1.log" Nov 25 10:05:09 crc kubenswrapper[4565]: I1125 10:05:09.908620 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cb74df96-sj4j7_cbdce822-eeeb-448b-9f3b-46fdf9e9b43d/manager/0.log" Nov 25 10:05:09 crc kubenswrapper[4565]: I1125 10:05:09.936188 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-864885998-v2c96_3791b99a-d877-470f-8a8f-56f7b02be997/kube-rbac-proxy/0.log" Nov 25 10:05:09 crc kubenswrapper[4565]: I1125 10:05:09.962184 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-864885998-v2c96_3791b99a-d877-470f-8a8f-56f7b02be997/manager/3.log" Nov 25 10:05:10 crc kubenswrapper[4565]: I1125 10:05:10.082679 4565 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-864885998-v2c96_3791b99a-d877-470f-8a8f-56f7b02be997/manager/2.log" Nov 25 10:05:10 crc kubenswrapper[4565]: I1125 10:05:10.103140 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:05:10 crc kubenswrapper[4565]: E1125 10:05:10.104768 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:05:23 crc kubenswrapper[4565]: I1125 10:05:23.097465 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:05:23 crc kubenswrapper[4565]: E1125 10:05:23.098705 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:05:26 crc kubenswrapper[4565]: I1125 10:05:26.605276 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-smktm_00c23670-fc21-4730-a27e-ac490261f994/control-plane-machine-set-operator/0.log" Nov 25 10:05:26 crc kubenswrapper[4565]: I1125 10:05:26.806202 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8xkx9_262e9d06-4e95-4017-85d9-5657f520eb49/kube-rbac-proxy/0.log" Nov 25 10:05:27 crc kubenswrapper[4565]: I1125 10:05:27.083529 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8xkx9_262e9d06-4e95-4017-85d9-5657f520eb49/machine-api-operator/0.log" Nov 25 10:05:36 crc kubenswrapper[4565]: I1125 10:05:36.096921 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:05:36 crc kubenswrapper[4565]: E1125 10:05:36.097887 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:05:38 crc kubenswrapper[4565]: I1125 10:05:38.455208 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-h4n5f_bac4df7d-2428-4150-881b-5695b1cfbddd/cert-manager-controller/0.log" Nov 25 10:05:38 crc kubenswrapper[4565]: I1125 10:05:38.624284 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-fqzsz_9500e97b-07b7-43d8-bfdf-dab609ce7f67/cert-manager-cainjector/0.log" Nov 25 10:05:38 crc kubenswrapper[4565]: I1125 10:05:38.637316 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-vpzsg_be96e081-d820-40aa-81e9-bff6c2392110/cert-manager-webhook/0.log" Nov 25 10:05:50 crc kubenswrapper[4565]: I1125 10:05:50.098183 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:05:50 crc 
kubenswrapper[4565]: E1125 10:05:50.099632 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:05:50 crc kubenswrapper[4565]: I1125 10:05:50.574245 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5874bd7bc5-d25tx_538d2898-95e6-4651-89d1-d5cb979d7aab/nmstate-console-plugin/0.log" Nov 25 10:05:50 crc kubenswrapper[4565]: I1125 10:05:50.707232 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-b57v9_26159181-25a7-4f96-8bf7-059faaff18e0/nmstate-handler/0.log" Nov 25 10:05:50 crc kubenswrapper[4565]: I1125 10:05:50.766128 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-gfhjf_5745386c-25f6-4be7-bdc7-c299e25185d4/kube-rbac-proxy/0.log" Nov 25 10:05:50 crc kubenswrapper[4565]: I1125 10:05:50.875817 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-gfhjf_5745386c-25f6-4be7-bdc7-c299e25185d4/nmstate-metrics/0.log" Nov 25 10:05:51 crc kubenswrapper[4565]: I1125 10:05:51.095228 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-557fdffb88-xvfww_63b01418-682e-4ebe-874d-aab5928c222a/nmstate-operator/0.log" Nov 25 10:05:51 crc kubenswrapper[4565]: I1125 10:05:51.128654 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6b89b748d8-bgc56_e12b0906-c6a2-468a-8bc1-a29bda6a25e3/nmstate-webhook/0.log" Nov 25 10:06:01 crc kubenswrapper[4565]: I1125 10:06:01.098906 4565 scope.go:117] 
"RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:06:01 crc kubenswrapper[4565]: E1125 10:06:01.099824 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:06:05 crc kubenswrapper[4565]: I1125 10:06:05.216657 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-gzvgq_2d23227d-3456-4a34-9aa7-4878c7ee4d37/kube-rbac-proxy/0.log" Nov 25 10:06:05 crc kubenswrapper[4565]: I1125 10:06:05.312026 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-gzvgq_2d23227d-3456-4a34-9aa7-4878c7ee4d37/controller/0.log" Nov 25 10:06:05 crc kubenswrapper[4565]: I1125 10:06:05.458411 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/cp-frr-files/0.log" Nov 25 10:06:05 crc kubenswrapper[4565]: I1125 10:06:05.607983 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/cp-frr-files/0.log" Nov 25 10:06:05 crc kubenswrapper[4565]: I1125 10:06:05.684136 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/cp-reloader/0.log" Nov 25 10:06:05 crc kubenswrapper[4565]: I1125 10:06:05.684988 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/cp-metrics/0.log" Nov 25 10:06:05 crc kubenswrapper[4565]: I1125 10:06:05.757039 4565 log.go:25] "Finished parsing log 
file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/cp-reloader/0.log" Nov 25 10:06:05 crc kubenswrapper[4565]: I1125 10:06:05.959795 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/cp-reloader/0.log" Nov 25 10:06:06 crc kubenswrapper[4565]: I1125 10:06:06.006528 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/cp-metrics/0.log" Nov 25 10:06:06 crc kubenswrapper[4565]: I1125 10:06:06.011540 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/cp-frr-files/0.log" Nov 25 10:06:06 crc kubenswrapper[4565]: I1125 10:06:06.031466 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/cp-metrics/0.log" Nov 25 10:06:06 crc kubenswrapper[4565]: I1125 10:06:06.202034 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/cp-reloader/0.log" Nov 25 10:06:06 crc kubenswrapper[4565]: I1125 10:06:06.214383 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/cp-metrics/0.log" Nov 25 10:06:06 crc kubenswrapper[4565]: I1125 10:06:06.257007 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/cp-frr-files/0.log" Nov 25 10:06:06 crc kubenswrapper[4565]: I1125 10:06:06.321480 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/controller/0.log" Nov 25 10:06:06 crc kubenswrapper[4565]: I1125 10:06:06.497559 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/frr-metrics/0.log" Nov 25 10:06:06 crc kubenswrapper[4565]: I1125 10:06:06.497685 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/kube-rbac-proxy/0.log" Nov 25 10:06:06 crc kubenswrapper[4565]: I1125 10:06:06.653707 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/kube-rbac-proxy-frr/0.log" Nov 25 10:06:06 crc kubenswrapper[4565]: I1125 10:06:06.952807 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/reloader/0.log" Nov 25 10:06:06 crc kubenswrapper[4565]: I1125 10:06:06.977008 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-6998585d5-b722z_23dd1cf1-30ed-4fc1-9b32-70897895e05d/frr-k8s-webhook-server/0.log" Nov 25 10:06:07 crc kubenswrapper[4565]: I1125 10:06:07.254649 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-74454849f9-fjwfp_145e5d59-fd78-4bc1-a97c-17ebf0d67fa4/manager/3.log" Nov 25 10:06:07 crc kubenswrapper[4565]: I1125 10:06:07.312274 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-74454849f9-fjwfp_145e5d59-fd78-4bc1-a97c-17ebf0d67fa4/manager/2.log" Nov 25 10:06:07 crc kubenswrapper[4565]: I1125 10:06:07.572065 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7fb8cb44b7-5dvrd_14fec6d4-6935-4283-944d-6a229b3cdc82/webhook-server/0.log" Nov 25 10:06:07 crc kubenswrapper[4565]: I1125 10:06:07.712246 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/frr/0.log" Nov 25 10:06:08 crc kubenswrapper[4565]: I1125 
10:06:08.289464 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dr2xf_1d95b8b8-1675-48ae-b497-ff3fcf0bbc42/kube-rbac-proxy/0.log" Nov 25 10:06:08 crc kubenswrapper[4565]: I1125 10:06:08.661350 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dr2xf_1d95b8b8-1675-48ae-b497-ff3fcf0bbc42/speaker/0.log" Nov 25 10:06:14 crc kubenswrapper[4565]: I1125 10:06:14.097195 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:06:14 crc kubenswrapper[4565]: E1125 10:06:14.099011 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:06:22 crc kubenswrapper[4565]: I1125 10:06:22.783021 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9_950c1190-c404-483f-bb4a-5a3fe7548ccf/util/0.log" Nov 25 10:06:22 crc kubenswrapper[4565]: I1125 10:06:22.976195 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9_950c1190-c404-483f-bb4a-5a3fe7548ccf/util/0.log" Nov 25 10:06:22 crc kubenswrapper[4565]: I1125 10:06:22.985451 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9_950c1190-c404-483f-bb4a-5a3fe7548ccf/pull/0.log" Nov 25 10:06:23 crc kubenswrapper[4565]: I1125 10:06:23.062407 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9_950c1190-c404-483f-bb4a-5a3fe7548ccf/pull/0.log" Nov 25 10:06:23 crc kubenswrapper[4565]: I1125 10:06:23.308518 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9_950c1190-c404-483f-bb4a-5a3fe7548ccf/pull/0.log" Nov 25 10:06:23 crc kubenswrapper[4565]: I1125 10:06:23.341871 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9_950c1190-c404-483f-bb4a-5a3fe7548ccf/util/0.log" Nov 25 10:06:23 crc kubenswrapper[4565]: I1125 10:06:23.344945 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9_950c1190-c404-483f-bb4a-5a3fe7548ccf/extract/0.log" Nov 25 10:06:23 crc kubenswrapper[4565]: I1125 10:06:23.527516 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wrgj8_45e414d4-5edb-48e0-a419-2d00347fda7c/extract-utilities/0.log" Nov 25 10:06:23 crc kubenswrapper[4565]: I1125 10:06:23.714795 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wrgj8_45e414d4-5edb-48e0-a419-2d00347fda7c/extract-utilities/0.log" Nov 25 10:06:23 crc kubenswrapper[4565]: I1125 10:06:23.735694 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wrgj8_45e414d4-5edb-48e0-a419-2d00347fda7c/extract-content/0.log" Nov 25 10:06:23 crc kubenswrapper[4565]: I1125 10:06:23.768781 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wrgj8_45e414d4-5edb-48e0-a419-2d00347fda7c/extract-content/0.log" Nov 25 10:06:23 crc kubenswrapper[4565]: I1125 10:06:23.988675 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-wrgj8_45e414d4-5edb-48e0-a419-2d00347fda7c/extract-content/0.log" Nov 25 10:06:24 crc kubenswrapper[4565]: I1125 10:06:24.025421 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wrgj8_45e414d4-5edb-48e0-a419-2d00347fda7c/extract-utilities/0.log" Nov 25 10:06:24 crc kubenswrapper[4565]: I1125 10:06:24.320847 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dq959_37044fab-4a08-4d4f-a2d4-9da1a0eb5127/extract-utilities/0.log" Nov 25 10:06:24 crc kubenswrapper[4565]: I1125 10:06:24.392879 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wrgj8_45e414d4-5edb-48e0-a419-2d00347fda7c/registry-server/0.log" Nov 25 10:06:24 crc kubenswrapper[4565]: I1125 10:06:24.754054 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dq959_37044fab-4a08-4d4f-a2d4-9da1a0eb5127/extract-utilities/0.log" Nov 25 10:06:24 crc kubenswrapper[4565]: I1125 10:06:24.825958 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dq959_37044fab-4a08-4d4f-a2d4-9da1a0eb5127/extract-content/0.log" Nov 25 10:06:24 crc kubenswrapper[4565]: I1125 10:06:24.868184 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dq959_37044fab-4a08-4d4f-a2d4-9da1a0eb5127/extract-content/0.log" Nov 25 10:06:25 crc kubenswrapper[4565]: I1125 10:06:25.051879 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dq959_37044fab-4a08-4d4f-a2d4-9da1a0eb5127/extract-utilities/0.log" Nov 25 10:06:25 crc kubenswrapper[4565]: I1125 10:06:25.074072 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-dq959_37044fab-4a08-4d4f-a2d4-9da1a0eb5127/extract-content/0.log" Nov 25 10:06:25 crc kubenswrapper[4565]: I1125 10:06:25.463429 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m_f291e538-ed14-4652-bf08-0b52ac487353/util/0.log" Nov 25 10:06:25 crc kubenswrapper[4565]: I1125 10:06:25.569701 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dq959_37044fab-4a08-4d4f-a2d4-9da1a0eb5127/registry-server/0.log" Nov 25 10:06:25 crc kubenswrapper[4565]: I1125 10:06:25.671712 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m_f291e538-ed14-4652-bf08-0b52ac487353/pull/0.log" Nov 25 10:06:25 crc kubenswrapper[4565]: I1125 10:06:25.671826 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m_f291e538-ed14-4652-bf08-0b52ac487353/util/0.log" Nov 25 10:06:25 crc kubenswrapper[4565]: I1125 10:06:25.745781 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m_f291e538-ed14-4652-bf08-0b52ac487353/pull/0.log" Nov 25 10:06:25 crc kubenswrapper[4565]: I1125 10:06:25.990893 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m_f291e538-ed14-4652-bf08-0b52ac487353/util/0.log" Nov 25 10:06:26 crc kubenswrapper[4565]: I1125 10:06:26.013844 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m_f291e538-ed14-4652-bf08-0b52ac487353/extract/0.log" Nov 25 10:06:26 crc kubenswrapper[4565]: I1125 
10:06:26.041259 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m_f291e538-ed14-4652-bf08-0b52ac487353/pull/0.log" Nov 25 10:06:26 crc kubenswrapper[4565]: I1125 10:06:26.177771 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-rgcml_5957e9ea-c2fe-43cb-9318-e22ae96c689c/marketplace-operator/0.log" Nov 25 10:06:26 crc kubenswrapper[4565]: I1125 10:06:26.247126 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mnphj_849cf673-c6d9-4372-a23d-96e04c71a796/extract-utilities/0.log" Nov 25 10:06:26 crc kubenswrapper[4565]: I1125 10:06:26.476027 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mnphj_849cf673-c6d9-4372-a23d-96e04c71a796/extract-content/0.log" Nov 25 10:06:26 crc kubenswrapper[4565]: I1125 10:06:26.485436 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mnphj_849cf673-c6d9-4372-a23d-96e04c71a796/extract-utilities/0.log" Nov 25 10:06:26 crc kubenswrapper[4565]: I1125 10:06:26.508107 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mnphj_849cf673-c6d9-4372-a23d-96e04c71a796/extract-content/0.log" Nov 25 10:06:26 crc kubenswrapper[4565]: I1125 10:06:26.735033 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mnphj_849cf673-c6d9-4372-a23d-96e04c71a796/extract-utilities/0.log" Nov 25 10:06:26 crc kubenswrapper[4565]: I1125 10:06:26.767300 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mnphj_849cf673-c6d9-4372-a23d-96e04c71a796/extract-content/0.log" Nov 25 10:06:26 crc kubenswrapper[4565]: I1125 10:06:26.879783 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-mnphj_849cf673-c6d9-4372-a23d-96e04c71a796/registry-server/0.log" Nov 25 10:06:26 crc kubenswrapper[4565]: I1125 10:06:26.917619 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pwhqj_23308127-bd19-4001-80f1-9cc93c692984/extract-utilities/0.log" Nov 25 10:06:27 crc kubenswrapper[4565]: I1125 10:06:27.128017 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pwhqj_23308127-bd19-4001-80f1-9cc93c692984/extract-utilities/0.log" Nov 25 10:06:27 crc kubenswrapper[4565]: I1125 10:06:27.147468 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pwhqj_23308127-bd19-4001-80f1-9cc93c692984/extract-content/0.log" Nov 25 10:06:27 crc kubenswrapper[4565]: I1125 10:06:27.147475 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pwhqj_23308127-bd19-4001-80f1-9cc93c692984/extract-content/0.log" Nov 25 10:06:27 crc kubenswrapper[4565]: I1125 10:06:27.300450 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pwhqj_23308127-bd19-4001-80f1-9cc93c692984/extract-utilities/0.log" Nov 25 10:06:27 crc kubenswrapper[4565]: I1125 10:06:27.333575 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pwhqj_23308127-bd19-4001-80f1-9cc93c692984/extract-content/0.log" Nov 25 10:06:27 crc kubenswrapper[4565]: I1125 10:06:27.753836 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pwhqj_23308127-bd19-4001-80f1-9cc93c692984/registry-server/0.log" Nov 25 10:06:28 crc kubenswrapper[4565]: I1125 10:06:28.097060 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:06:28 crc kubenswrapper[4565]: E1125 10:06:28.097433 
4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:06:41 crc kubenswrapper[4565]: I1125 10:06:41.098554 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:06:41 crc kubenswrapper[4565]: E1125 10:06:41.099356 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:06:53 crc kubenswrapper[4565]: I1125 10:06:53.097063 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:06:53 crc kubenswrapper[4565]: E1125 10:06:53.098331 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:07:05 crc kubenswrapper[4565]: I1125 10:07:05.100040 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:07:05 crc kubenswrapper[4565]: E1125 
10:07:05.102030 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:07:19 crc kubenswrapper[4565]: I1125 10:07:19.097364 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:07:19 crc kubenswrapper[4565]: E1125 10:07:19.098437 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:07:31 crc kubenswrapper[4565]: I1125 10:07:31.097038 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:07:31 crc kubenswrapper[4565]: E1125 10:07:31.097959 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:07:45 crc kubenswrapper[4565]: I1125 10:07:45.097525 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:07:45 crc 
kubenswrapper[4565]: E1125 10:07:45.098527 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:07:57 crc kubenswrapper[4565]: I1125 10:07:57.104857 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:07:57 crc kubenswrapper[4565]: E1125 10:07:57.105872 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:08:08 crc kubenswrapper[4565]: I1125 10:08:08.097828 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:08:08 crc kubenswrapper[4565]: E1125 10:08:08.098739 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:08:20 crc kubenswrapper[4565]: I1125 10:08:20.097079 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 
25 10:08:20 crc kubenswrapper[4565]: E1125 10:08:20.097999 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:08:34 crc kubenswrapper[4565]: I1125 10:08:34.098084 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:08:34 crc kubenswrapper[4565]: E1125 10:08:34.098923 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:08:38 crc kubenswrapper[4565]: I1125 10:08:38.974017 4565 generic.go:334] "Generic (PLEG): container finished" podID="ab256124-fb53-42eb-b130-47c81598e7b9" containerID="e3463d9f6a27bdfdec3dad6d18e17d7453b6a4d195fc5c80b5437556f2b55e71" exitCode=0 Nov 25 10:08:38 crc kubenswrapper[4565]: I1125 10:08:38.974141 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qx9jc/must-gather-qvhhg" event={"ID":"ab256124-fb53-42eb-b130-47c81598e7b9","Type":"ContainerDied","Data":"e3463d9f6a27bdfdec3dad6d18e17d7453b6a4d195fc5c80b5437556f2b55e71"} Nov 25 10:08:38 crc kubenswrapper[4565]: I1125 10:08:38.975963 4565 scope.go:117] "RemoveContainer" containerID="e3463d9f6a27bdfdec3dad6d18e17d7453b6a4d195fc5c80b5437556f2b55e71" Nov 25 10:08:39 crc kubenswrapper[4565]: I1125 10:08:39.237622 4565 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-must-gather-qx9jc_must-gather-qvhhg_ab256124-fb53-42eb-b130-47c81598e7b9/gather/0.log" Nov 25 10:08:47 crc kubenswrapper[4565]: I1125 10:08:47.946475 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qx9jc/must-gather-qvhhg"] Nov 25 10:08:47 crc kubenswrapper[4565]: I1125 10:08:47.950174 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-qx9jc/must-gather-qvhhg" podUID="ab256124-fb53-42eb-b130-47c81598e7b9" containerName="copy" containerID="cri-o://e99880874bfa4d6f26d6dd61dd3921406d1c0a4789160765ad4a52e42c80fb05" gracePeriod=2 Nov 25 10:08:47 crc kubenswrapper[4565]: I1125 10:08:47.968151 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qx9jc/must-gather-qvhhg"] Nov 25 10:08:48 crc kubenswrapper[4565]: I1125 10:08:48.458482 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qx9jc_must-gather-qvhhg_ab256124-fb53-42eb-b130-47c81598e7b9/copy/0.log" Nov 25 10:08:48 crc kubenswrapper[4565]: I1125 10:08:48.459759 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qx9jc/must-gather-qvhhg" Nov 25 10:08:48 crc kubenswrapper[4565]: I1125 10:08:48.637118 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k56q4\" (UniqueName: \"kubernetes.io/projected/ab256124-fb53-42eb-b130-47c81598e7b9-kube-api-access-k56q4\") pod \"ab256124-fb53-42eb-b130-47c81598e7b9\" (UID: \"ab256124-fb53-42eb-b130-47c81598e7b9\") " Nov 25 10:08:48 crc kubenswrapper[4565]: I1125 10:08:48.637640 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ab256124-fb53-42eb-b130-47c81598e7b9-must-gather-output\") pod \"ab256124-fb53-42eb-b130-47c81598e7b9\" (UID: \"ab256124-fb53-42eb-b130-47c81598e7b9\") " Nov 25 10:08:48 crc kubenswrapper[4565]: I1125 10:08:48.646235 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab256124-fb53-42eb-b130-47c81598e7b9-kube-api-access-k56q4" (OuterVolumeSpecName: "kube-api-access-k56q4") pod "ab256124-fb53-42eb-b130-47c81598e7b9" (UID: "ab256124-fb53-42eb-b130-47c81598e7b9"). InnerVolumeSpecName "kube-api-access-k56q4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 10:08:48 crc kubenswrapper[4565]: I1125 10:08:48.740471 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k56q4\" (UniqueName: \"kubernetes.io/projected/ab256124-fb53-42eb-b130-47c81598e7b9-kube-api-access-k56q4\") on node \"crc\" DevicePath \"\"" Nov 25 10:08:48 crc kubenswrapper[4565]: I1125 10:08:48.774781 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab256124-fb53-42eb-b130-47c81598e7b9-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ab256124-fb53-42eb-b130-47c81598e7b9" (UID: "ab256124-fb53-42eb-b130-47c81598e7b9"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 10:08:48 crc kubenswrapper[4565]: I1125 10:08:48.842850 4565 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ab256124-fb53-42eb-b130-47c81598e7b9-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 25 10:08:49 crc kubenswrapper[4565]: I1125 10:08:49.074138 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qx9jc_must-gather-qvhhg_ab256124-fb53-42eb-b130-47c81598e7b9/copy/0.log" Nov 25 10:08:49 crc kubenswrapper[4565]: I1125 10:08:49.075269 4565 generic.go:334] "Generic (PLEG): container finished" podID="ab256124-fb53-42eb-b130-47c81598e7b9" containerID="e99880874bfa4d6f26d6dd61dd3921406d1c0a4789160765ad4a52e42c80fb05" exitCode=143 Nov 25 10:08:49 crc kubenswrapper[4565]: I1125 10:08:49.075418 4565 scope.go:117] "RemoveContainer" containerID="e99880874bfa4d6f26d6dd61dd3921406d1c0a4789160765ad4a52e42c80fb05" Nov 25 10:08:49 crc kubenswrapper[4565]: I1125 10:08:49.075647 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qx9jc/must-gather-qvhhg" Nov 25 10:08:49 crc kubenswrapper[4565]: I1125 10:08:49.109500 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:08:49 crc kubenswrapper[4565]: E1125 10:08:49.109877 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:08:49 crc kubenswrapper[4565]: I1125 10:08:49.129441 4565 scope.go:117] "RemoveContainer" containerID="e3463d9f6a27bdfdec3dad6d18e17d7453b6a4d195fc5c80b5437556f2b55e71" Nov 25 10:08:49 crc kubenswrapper[4565]: I1125 10:08:49.132053 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab256124-fb53-42eb-b130-47c81598e7b9" path="/var/lib/kubelet/pods/ab256124-fb53-42eb-b130-47c81598e7b9/volumes" Nov 25 10:08:49 crc kubenswrapper[4565]: I1125 10:08:49.178328 4565 scope.go:117] "RemoveContainer" containerID="e99880874bfa4d6f26d6dd61dd3921406d1c0a4789160765ad4a52e42c80fb05" Nov 25 10:08:49 crc kubenswrapper[4565]: E1125 10:08:49.178967 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e99880874bfa4d6f26d6dd61dd3921406d1c0a4789160765ad4a52e42c80fb05\": container with ID starting with e99880874bfa4d6f26d6dd61dd3921406d1c0a4789160765ad4a52e42c80fb05 not found: ID does not exist" containerID="e99880874bfa4d6f26d6dd61dd3921406d1c0a4789160765ad4a52e42c80fb05" Nov 25 10:08:49 crc kubenswrapper[4565]: I1125 10:08:49.179017 4565 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e99880874bfa4d6f26d6dd61dd3921406d1c0a4789160765ad4a52e42c80fb05"} err="failed to get container status \"e99880874bfa4d6f26d6dd61dd3921406d1c0a4789160765ad4a52e42c80fb05\": rpc error: code = NotFound desc = could not find container \"e99880874bfa4d6f26d6dd61dd3921406d1c0a4789160765ad4a52e42c80fb05\": container with ID starting with e99880874bfa4d6f26d6dd61dd3921406d1c0a4789160765ad4a52e42c80fb05 not found: ID does not exist" Nov 25 10:08:49 crc kubenswrapper[4565]: I1125 10:08:49.179055 4565 scope.go:117] "RemoveContainer" containerID="e3463d9f6a27bdfdec3dad6d18e17d7453b6a4d195fc5c80b5437556f2b55e71" Nov 25 10:08:49 crc kubenswrapper[4565]: E1125 10:08:49.182392 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3463d9f6a27bdfdec3dad6d18e17d7453b6a4d195fc5c80b5437556f2b55e71\": container with ID starting with e3463d9f6a27bdfdec3dad6d18e17d7453b6a4d195fc5c80b5437556f2b55e71 not found: ID does not exist" containerID="e3463d9f6a27bdfdec3dad6d18e17d7453b6a4d195fc5c80b5437556f2b55e71" Nov 25 10:08:49 crc kubenswrapper[4565]: I1125 10:08:49.182454 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3463d9f6a27bdfdec3dad6d18e17d7453b6a4d195fc5c80b5437556f2b55e71"} err="failed to get container status \"e3463d9f6a27bdfdec3dad6d18e17d7453b6a4d195fc5c80b5437556f2b55e71\": rpc error: code = NotFound desc = could not find container \"e3463d9f6a27bdfdec3dad6d18e17d7453b6a4d195fc5c80b5437556f2b55e71\": container with ID starting with e3463d9f6a27bdfdec3dad6d18e17d7453b6a4d195fc5c80b5437556f2b55e71 not found: ID does not exist" Nov 25 10:09:02 crc kubenswrapper[4565]: I1125 10:09:02.097566 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:09:03 crc kubenswrapper[4565]: I1125 10:09:03.214144 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerStarted","Data":"2c700bc6358e1ec78e7ef216340493892cce22209e9acdeedd22c23d90b0fb59"} Nov 25 10:09:37 crc kubenswrapper[4565]: I1125 10:09:37.920565 4565 scope.go:117] "RemoveContainer" containerID="b48759ab1a282ec541542b69dd8233a65a0610437c3216c5df05b40ae3a8c076" Nov 25 10:10:38 crc kubenswrapper[4565]: I1125 10:10:38.006887 4565 scope.go:117] "RemoveContainer" containerID="bd61671f03c90477320bf60147a48be339f3a8759ac6e734e6317db663b2fd7f" Nov 25 10:10:48 crc kubenswrapper[4565]: I1125 10:10:48.623885 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s5zgl"] Nov 25 10:10:48 crc kubenswrapper[4565]: E1125 10:10:48.624725 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab256124-fb53-42eb-b130-47c81598e7b9" containerName="gather" Nov 25 10:10:48 crc kubenswrapper[4565]: I1125 10:10:48.624741 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab256124-fb53-42eb-b130-47c81598e7b9" containerName="gather" Nov 25 10:10:48 crc kubenswrapper[4565]: E1125 10:10:48.624764 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b96701eb-417f-42c4-b805-06e71d3aec78" containerName="container-00" Nov 25 10:10:48 crc kubenswrapper[4565]: I1125 10:10:48.624769 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b96701eb-417f-42c4-b805-06e71d3aec78" containerName="container-00" Nov 25 10:10:48 crc kubenswrapper[4565]: E1125 10:10:48.624785 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab256124-fb53-42eb-b130-47c81598e7b9" containerName="copy" Nov 25 10:10:48 crc kubenswrapper[4565]: I1125 10:10:48.624790 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab256124-fb53-42eb-b130-47c81598e7b9" containerName="copy" Nov 25 10:10:48 crc kubenswrapper[4565]: I1125 10:10:48.624956 4565 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="ab256124-fb53-42eb-b130-47c81598e7b9" containerName="copy" Nov 25 10:10:48 crc kubenswrapper[4565]: I1125 10:10:48.624972 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab256124-fb53-42eb-b130-47c81598e7b9" containerName="gather" Nov 25 10:10:48 crc kubenswrapper[4565]: I1125 10:10:48.624982 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="b96701eb-417f-42c4-b805-06e71d3aec78" containerName="container-00" Nov 25 10:10:48 crc kubenswrapper[4565]: I1125 10:10:48.626094 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s5zgl" Nov 25 10:10:48 crc kubenswrapper[4565]: I1125 10:10:48.678634 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s5zgl"] Nov 25 10:10:48 crc kubenswrapper[4565]: I1125 10:10:48.764445 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/844d82c9-70cb-404f-bcb9-3125b2d37407-catalog-content\") pod \"community-operators-s5zgl\" (UID: \"844d82c9-70cb-404f-bcb9-3125b2d37407\") " pod="openshift-marketplace/community-operators-s5zgl" Nov 25 10:10:48 crc kubenswrapper[4565]: I1125 10:10:48.764530 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/844d82c9-70cb-404f-bcb9-3125b2d37407-utilities\") pod \"community-operators-s5zgl\" (UID: \"844d82c9-70cb-404f-bcb9-3125b2d37407\") " pod="openshift-marketplace/community-operators-s5zgl" Nov 25 10:10:48 crc kubenswrapper[4565]: I1125 10:10:48.764567 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xj8s\" (UniqueName: \"kubernetes.io/projected/844d82c9-70cb-404f-bcb9-3125b2d37407-kube-api-access-2xj8s\") pod 
\"community-operators-s5zgl\" (UID: \"844d82c9-70cb-404f-bcb9-3125b2d37407\") " pod="openshift-marketplace/community-operators-s5zgl" Nov 25 10:10:48 crc kubenswrapper[4565]: I1125 10:10:48.866717 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/844d82c9-70cb-404f-bcb9-3125b2d37407-utilities\") pod \"community-operators-s5zgl\" (UID: \"844d82c9-70cb-404f-bcb9-3125b2d37407\") " pod="openshift-marketplace/community-operators-s5zgl" Nov 25 10:10:48 crc kubenswrapper[4565]: I1125 10:10:48.866842 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xj8s\" (UniqueName: \"kubernetes.io/projected/844d82c9-70cb-404f-bcb9-3125b2d37407-kube-api-access-2xj8s\") pod \"community-operators-s5zgl\" (UID: \"844d82c9-70cb-404f-bcb9-3125b2d37407\") " pod="openshift-marketplace/community-operators-s5zgl" Nov 25 10:10:48 crc kubenswrapper[4565]: I1125 10:10:48.866974 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/844d82c9-70cb-404f-bcb9-3125b2d37407-catalog-content\") pod \"community-operators-s5zgl\" (UID: \"844d82c9-70cb-404f-bcb9-3125b2d37407\") " pod="openshift-marketplace/community-operators-s5zgl" Nov 25 10:10:48 crc kubenswrapper[4565]: I1125 10:10:48.867466 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/844d82c9-70cb-404f-bcb9-3125b2d37407-catalog-content\") pod \"community-operators-s5zgl\" (UID: \"844d82c9-70cb-404f-bcb9-3125b2d37407\") " pod="openshift-marketplace/community-operators-s5zgl" Nov 25 10:10:48 crc kubenswrapper[4565]: I1125 10:10:48.867707 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/844d82c9-70cb-404f-bcb9-3125b2d37407-utilities\") pod \"community-operators-s5zgl\" (UID: 
\"844d82c9-70cb-404f-bcb9-3125b2d37407\") " pod="openshift-marketplace/community-operators-s5zgl" Nov 25 10:10:48 crc kubenswrapper[4565]: I1125 10:10:48.889057 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xj8s\" (UniqueName: \"kubernetes.io/projected/844d82c9-70cb-404f-bcb9-3125b2d37407-kube-api-access-2xj8s\") pod \"community-operators-s5zgl\" (UID: \"844d82c9-70cb-404f-bcb9-3125b2d37407\") " pod="openshift-marketplace/community-operators-s5zgl" Nov 25 10:10:48 crc kubenswrapper[4565]: I1125 10:10:48.942398 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s5zgl" Nov 25 10:10:49 crc kubenswrapper[4565]: I1125 10:10:49.474466 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s5zgl"] Nov 25 10:10:50 crc kubenswrapper[4565]: I1125 10:10:50.235494 4565 generic.go:334] "Generic (PLEG): container finished" podID="844d82c9-70cb-404f-bcb9-3125b2d37407" containerID="0a373fba33ddf16e6afc5002fcc4c98a829226974f8d51ec4d1996c2f97037e1" exitCode=0 Nov 25 10:10:50 crc kubenswrapper[4565]: I1125 10:10:50.235704 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5zgl" event={"ID":"844d82c9-70cb-404f-bcb9-3125b2d37407","Type":"ContainerDied","Data":"0a373fba33ddf16e6afc5002fcc4c98a829226974f8d51ec4d1996c2f97037e1"} Nov 25 10:10:50 crc kubenswrapper[4565]: I1125 10:10:50.235850 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5zgl" event={"ID":"844d82c9-70cb-404f-bcb9-3125b2d37407","Type":"ContainerStarted","Data":"d88425fd7633938fd7199a829d0c07d98d1df8bca3f0e4016089ea1c2af3e5fd"} Nov 25 10:10:50 crc kubenswrapper[4565]: I1125 10:10:50.239261 4565 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 10:10:51 crc kubenswrapper[4565]: I1125 10:10:51.249592 4565 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5zgl" event={"ID":"844d82c9-70cb-404f-bcb9-3125b2d37407","Type":"ContainerStarted","Data":"f662f004d2201c5faf7181549a2906a61f1f2ae798d8197f25b1b5df28533c73"} Nov 25 10:10:52 crc kubenswrapper[4565]: I1125 10:10:52.276582 4565 generic.go:334] "Generic (PLEG): container finished" podID="844d82c9-70cb-404f-bcb9-3125b2d37407" containerID="f662f004d2201c5faf7181549a2906a61f1f2ae798d8197f25b1b5df28533c73" exitCode=0 Nov 25 10:10:52 crc kubenswrapper[4565]: I1125 10:10:52.279014 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5zgl" event={"ID":"844d82c9-70cb-404f-bcb9-3125b2d37407","Type":"ContainerDied","Data":"f662f004d2201c5faf7181549a2906a61f1f2ae798d8197f25b1b5df28533c73"} Nov 25 10:10:53 crc kubenswrapper[4565]: I1125 10:10:53.211018 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ghgcv"] Nov 25 10:10:53 crc kubenswrapper[4565]: I1125 10:10:53.213169 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghgcv" Nov 25 10:10:53 crc kubenswrapper[4565]: I1125 10:10:53.223400 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghgcv"] Nov 25 10:10:53 crc kubenswrapper[4565]: I1125 10:10:53.282587 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm87l\" (UniqueName: \"kubernetes.io/projected/76042ab9-7ae8-4caa-bcb3-97daaa72db31-kube-api-access-qm87l\") pod \"redhat-marketplace-ghgcv\" (UID: \"76042ab9-7ae8-4caa-bcb3-97daaa72db31\") " pod="openshift-marketplace/redhat-marketplace-ghgcv" Nov 25 10:10:53 crc kubenswrapper[4565]: I1125 10:10:53.282802 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76042ab9-7ae8-4caa-bcb3-97daaa72db31-utilities\") pod \"redhat-marketplace-ghgcv\" (UID: \"76042ab9-7ae8-4caa-bcb3-97daaa72db31\") " pod="openshift-marketplace/redhat-marketplace-ghgcv" Nov 25 10:10:53 crc kubenswrapper[4565]: I1125 10:10:53.283133 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76042ab9-7ae8-4caa-bcb3-97daaa72db31-catalog-content\") pod \"redhat-marketplace-ghgcv\" (UID: \"76042ab9-7ae8-4caa-bcb3-97daaa72db31\") " pod="openshift-marketplace/redhat-marketplace-ghgcv" Nov 25 10:10:53 crc kubenswrapper[4565]: I1125 10:10:53.290322 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5zgl" event={"ID":"844d82c9-70cb-404f-bcb9-3125b2d37407","Type":"ContainerStarted","Data":"840daa5ddcc1b7e7b4a8db2d1b1c96ce34a7b30a3656128bda689eb0e7f00e2c"} Nov 25 10:10:53 crc kubenswrapper[4565]: I1125 10:10:53.309268 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-s5zgl" podStartSLOduration=2.767253497 podStartE2EDuration="5.309252984s" podCreationTimestamp="2025-11-25 10:10:48 +0000 UTC" firstStartedPulling="2025-11-25 10:10:50.238988999 +0000 UTC m=+3983.441484137" lastFinishedPulling="2025-11-25 10:10:52.780988486 +0000 UTC m=+3985.983483624" observedRunningTime="2025-11-25 10:10:53.306027514 +0000 UTC m=+3986.508522642" watchObservedRunningTime="2025-11-25 10:10:53.309252984 +0000 UTC m=+3986.511748122" Nov 25 10:10:53 crc kubenswrapper[4565]: I1125 10:10:53.384151 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76042ab9-7ae8-4caa-bcb3-97daaa72db31-catalog-content\") pod \"redhat-marketplace-ghgcv\" (UID: \"76042ab9-7ae8-4caa-bcb3-97daaa72db31\") " pod="openshift-marketplace/redhat-marketplace-ghgcv" Nov 25 10:10:53 crc kubenswrapper[4565]: I1125 10:10:53.384541 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm87l\" (UniqueName: \"kubernetes.io/projected/76042ab9-7ae8-4caa-bcb3-97daaa72db31-kube-api-access-qm87l\") pod \"redhat-marketplace-ghgcv\" (UID: \"76042ab9-7ae8-4caa-bcb3-97daaa72db31\") " pod="openshift-marketplace/redhat-marketplace-ghgcv" Nov 25 10:10:53 crc kubenswrapper[4565]: I1125 10:10:53.384655 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76042ab9-7ae8-4caa-bcb3-97daaa72db31-utilities\") pod \"redhat-marketplace-ghgcv\" (UID: \"76042ab9-7ae8-4caa-bcb3-97daaa72db31\") " pod="openshift-marketplace/redhat-marketplace-ghgcv" Nov 25 10:10:53 crc kubenswrapper[4565]: I1125 10:10:53.384643 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76042ab9-7ae8-4caa-bcb3-97daaa72db31-catalog-content\") pod \"redhat-marketplace-ghgcv\" (UID: 
\"76042ab9-7ae8-4caa-bcb3-97daaa72db31\") " pod="openshift-marketplace/redhat-marketplace-ghgcv" Nov 25 10:10:53 crc kubenswrapper[4565]: I1125 10:10:53.385018 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76042ab9-7ae8-4caa-bcb3-97daaa72db31-utilities\") pod \"redhat-marketplace-ghgcv\" (UID: \"76042ab9-7ae8-4caa-bcb3-97daaa72db31\") " pod="openshift-marketplace/redhat-marketplace-ghgcv" Nov 25 10:10:53 crc kubenswrapper[4565]: I1125 10:10:53.411487 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm87l\" (UniqueName: \"kubernetes.io/projected/76042ab9-7ae8-4caa-bcb3-97daaa72db31-kube-api-access-qm87l\") pod \"redhat-marketplace-ghgcv\" (UID: \"76042ab9-7ae8-4caa-bcb3-97daaa72db31\") " pod="openshift-marketplace/redhat-marketplace-ghgcv" Nov 25 10:10:53 crc kubenswrapper[4565]: I1125 10:10:53.529144 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghgcv" Nov 25 10:10:54 crc kubenswrapper[4565]: I1125 10:10:54.122037 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghgcv"] Nov 25 10:10:54 crc kubenswrapper[4565]: W1125 10:10:54.130630 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76042ab9_7ae8_4caa_bcb3_97daaa72db31.slice/crio-eae15fe669214dee6328b36afb9af352d77092e20d0b67ff5d011f289946c91d WatchSource:0}: Error finding container eae15fe669214dee6328b36afb9af352d77092e20d0b67ff5d011f289946c91d: Status 404 returned error can't find the container with id eae15fe669214dee6328b36afb9af352d77092e20d0b67ff5d011f289946c91d Nov 25 10:10:54 crc kubenswrapper[4565]: I1125 10:10:54.305983 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghgcv" 
event={"ID":"76042ab9-7ae8-4caa-bcb3-97daaa72db31","Type":"ContainerStarted","Data":"332e8b93c99a9a1070f4aab766ac6022f436169730c8cc423a8287135c6dd855"} Nov 25 10:10:54 crc kubenswrapper[4565]: I1125 10:10:54.306327 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghgcv" event={"ID":"76042ab9-7ae8-4caa-bcb3-97daaa72db31","Type":"ContainerStarted","Data":"eae15fe669214dee6328b36afb9af352d77092e20d0b67ff5d011f289946c91d"} Nov 25 10:10:54 crc kubenswrapper[4565]: I1125 10:10:54.582956 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5tm2p/must-gather-ldl4x"] Nov 25 10:10:54 crc kubenswrapper[4565]: I1125 10:10:54.589609 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5tm2p/must-gather-ldl4x" Nov 25 10:10:54 crc kubenswrapper[4565]: I1125 10:10:54.599821 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5tm2p"/"openshift-service-ca.crt" Nov 25 10:10:54 crc kubenswrapper[4565]: I1125 10:10:54.602183 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-5tm2p"/"default-dockercfg-qts56" Nov 25 10:10:54 crc kubenswrapper[4565]: I1125 10:10:54.604632 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5tm2p"/"kube-root-ca.crt" Nov 25 10:10:54 crc kubenswrapper[4565]: I1125 10:10:54.630757 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqs28\" (UniqueName: \"kubernetes.io/projected/5131b53e-f713-4c17-b497-89ec226e9df6-kube-api-access-bqs28\") pod \"must-gather-ldl4x\" (UID: \"5131b53e-f713-4c17-b497-89ec226e9df6\") " pod="openshift-must-gather-5tm2p/must-gather-ldl4x" Nov 25 10:10:54 crc kubenswrapper[4565]: I1125 10:10:54.631046 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" 
(UniqueName: \"kubernetes.io/empty-dir/5131b53e-f713-4c17-b497-89ec226e9df6-must-gather-output\") pod \"must-gather-ldl4x\" (UID: \"5131b53e-f713-4c17-b497-89ec226e9df6\") " pod="openshift-must-gather-5tm2p/must-gather-ldl4x" Nov 25 10:10:54 crc kubenswrapper[4565]: I1125 10:10:54.710569 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5tm2p/must-gather-ldl4x"] Nov 25 10:10:54 crc kubenswrapper[4565]: I1125 10:10:54.733579 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5131b53e-f713-4c17-b497-89ec226e9df6-must-gather-output\") pod \"must-gather-ldl4x\" (UID: \"5131b53e-f713-4c17-b497-89ec226e9df6\") " pod="openshift-must-gather-5tm2p/must-gather-ldl4x" Nov 25 10:10:54 crc kubenswrapper[4565]: I1125 10:10:54.733843 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqs28\" (UniqueName: \"kubernetes.io/projected/5131b53e-f713-4c17-b497-89ec226e9df6-kube-api-access-bqs28\") pod \"must-gather-ldl4x\" (UID: \"5131b53e-f713-4c17-b497-89ec226e9df6\") " pod="openshift-must-gather-5tm2p/must-gather-ldl4x" Nov 25 10:10:54 crc kubenswrapper[4565]: I1125 10:10:54.734036 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5131b53e-f713-4c17-b497-89ec226e9df6-must-gather-output\") pod \"must-gather-ldl4x\" (UID: \"5131b53e-f713-4c17-b497-89ec226e9df6\") " pod="openshift-must-gather-5tm2p/must-gather-ldl4x" Nov 25 10:10:54 crc kubenswrapper[4565]: I1125 10:10:54.750819 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqs28\" (UniqueName: \"kubernetes.io/projected/5131b53e-f713-4c17-b497-89ec226e9df6-kube-api-access-bqs28\") pod \"must-gather-ldl4x\" (UID: \"5131b53e-f713-4c17-b497-89ec226e9df6\") " pod="openshift-must-gather-5tm2p/must-gather-ldl4x" Nov 25 10:10:54 crc 
kubenswrapper[4565]: I1125 10:10:54.912821 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5tm2p/must-gather-ldl4x" Nov 25 10:10:55 crc kubenswrapper[4565]: I1125 10:10:55.322890 4565 generic.go:334] "Generic (PLEG): container finished" podID="76042ab9-7ae8-4caa-bcb3-97daaa72db31" containerID="332e8b93c99a9a1070f4aab766ac6022f436169730c8cc423a8287135c6dd855" exitCode=0 Nov 25 10:10:55 crc kubenswrapper[4565]: I1125 10:10:55.324412 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghgcv" event={"ID":"76042ab9-7ae8-4caa-bcb3-97daaa72db31","Type":"ContainerDied","Data":"332e8b93c99a9a1070f4aab766ac6022f436169730c8cc423a8287135c6dd855"} Nov 25 10:10:55 crc kubenswrapper[4565]: I1125 10:10:55.536221 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5tm2p/must-gather-ldl4x"] Nov 25 10:10:56 crc kubenswrapper[4565]: I1125 10:10:56.344778 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghgcv" event={"ID":"76042ab9-7ae8-4caa-bcb3-97daaa72db31","Type":"ContainerStarted","Data":"89c55b61c4fa111bc1d4ea5ab22efa6f4de744abbce03e63154b4cd6f5654d9a"} Nov 25 10:10:56 crc kubenswrapper[4565]: I1125 10:10:56.347723 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5tm2p/must-gather-ldl4x" event={"ID":"5131b53e-f713-4c17-b497-89ec226e9df6","Type":"ContainerStarted","Data":"6ade9fdd8804f58222770c0cd2b0bcf257f9adb5e1135b38e9165b0e81328b3e"} Nov 25 10:10:56 crc kubenswrapper[4565]: I1125 10:10:56.347780 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5tm2p/must-gather-ldl4x" event={"ID":"5131b53e-f713-4c17-b497-89ec226e9df6","Type":"ContainerStarted","Data":"6d85d3b3f57b6ed4964b0cd393cb14358df3a7b1221293bf7ee34b446e738b7a"} Nov 25 10:10:56 crc kubenswrapper[4565]: I1125 10:10:56.347796 4565 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-must-gather-5tm2p/must-gather-ldl4x" event={"ID":"5131b53e-f713-4c17-b497-89ec226e9df6","Type":"ContainerStarted","Data":"d404ef39df44ce6b57e4dfd47194cd0e43b11c5487127e4eea6bf5c2b9c03a38"} Nov 25 10:10:56 crc kubenswrapper[4565]: I1125 10:10:56.398083 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5tm2p/must-gather-ldl4x" podStartSLOduration=2.3980548 podStartE2EDuration="2.3980548s" podCreationTimestamp="2025-11-25 10:10:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 10:10:56.382330663 +0000 UTC m=+3989.584825800" watchObservedRunningTime="2025-11-25 10:10:56.3980548 +0000 UTC m=+3989.600549939" Nov 25 10:10:57 crc kubenswrapper[4565]: I1125 10:10:57.360899 4565 generic.go:334] "Generic (PLEG): container finished" podID="76042ab9-7ae8-4caa-bcb3-97daaa72db31" containerID="89c55b61c4fa111bc1d4ea5ab22efa6f4de744abbce03e63154b4cd6f5654d9a" exitCode=0 Nov 25 10:10:57 crc kubenswrapper[4565]: I1125 10:10:57.361003 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghgcv" event={"ID":"76042ab9-7ae8-4caa-bcb3-97daaa72db31","Type":"ContainerDied","Data":"89c55b61c4fa111bc1d4ea5ab22efa6f4de744abbce03e63154b4cd6f5654d9a"} Nov 25 10:10:58 crc kubenswrapper[4565]: I1125 10:10:58.373997 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghgcv" event={"ID":"76042ab9-7ae8-4caa-bcb3-97daaa72db31","Type":"ContainerStarted","Data":"df59234170c99eaafe94e34abcd99f4a44d36fce584c276ebef155ff6cf353c3"} Nov 25 10:10:58 crc kubenswrapper[4565]: I1125 10:10:58.417519 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ghgcv" podStartSLOduration=2.881589852 podStartE2EDuration="5.417499391s" podCreationTimestamp="2025-11-25 10:10:53 +0000 UTC" 
firstStartedPulling="2025-11-25 10:10:55.325114933 +0000 UTC m=+3988.527610071" lastFinishedPulling="2025-11-25 10:10:57.861024471 +0000 UTC m=+3991.063519610" observedRunningTime="2025-11-25 10:10:58.393275292 +0000 UTC m=+3991.595770430" watchObservedRunningTime="2025-11-25 10:10:58.417499391 +0000 UTC m=+3991.619994530" Nov 25 10:10:58 crc kubenswrapper[4565]: I1125 10:10:58.943533 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s5zgl" Nov 25 10:10:58 crc kubenswrapper[4565]: I1125 10:10:58.943871 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s5zgl" Nov 25 10:10:59 crc kubenswrapper[4565]: I1125 10:10:59.144628 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s5zgl" Nov 25 10:10:59 crc kubenswrapper[4565]: I1125 10:10:59.481365 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s5zgl" Nov 25 10:11:00 crc kubenswrapper[4565]: I1125 10:11:00.605363 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s5zgl"] Nov 25 10:11:01 crc kubenswrapper[4565]: I1125 10:11:01.399401 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s5zgl" podUID="844d82c9-70cb-404f-bcb9-3125b2d37407" containerName="registry-server" containerID="cri-o://840daa5ddcc1b7e7b4a8db2d1b1c96ce34a7b30a3656128bda689eb0e7f00e2c" gracePeriod=2 Nov 25 10:11:01 crc kubenswrapper[4565]: I1125 10:11:01.472174 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5tm2p/crc-debug-cqm4x"] Nov 25 10:11:01 crc kubenswrapper[4565]: I1125 10:11:01.473626 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5tm2p/crc-debug-cqm4x" Nov 25 10:11:01 crc kubenswrapper[4565]: I1125 10:11:01.508665 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbzzz\" (UniqueName: \"kubernetes.io/projected/c32ccd55-da46-4fc1-ae60-44a31ff629c4-kube-api-access-kbzzz\") pod \"crc-debug-cqm4x\" (UID: \"c32ccd55-da46-4fc1-ae60-44a31ff629c4\") " pod="openshift-must-gather-5tm2p/crc-debug-cqm4x" Nov 25 10:11:01 crc kubenswrapper[4565]: I1125 10:11:01.508726 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c32ccd55-da46-4fc1-ae60-44a31ff629c4-host\") pod \"crc-debug-cqm4x\" (UID: \"c32ccd55-da46-4fc1-ae60-44a31ff629c4\") " pod="openshift-must-gather-5tm2p/crc-debug-cqm4x" Nov 25 10:11:01 crc kubenswrapper[4565]: I1125 10:11:01.612208 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbzzz\" (UniqueName: \"kubernetes.io/projected/c32ccd55-da46-4fc1-ae60-44a31ff629c4-kube-api-access-kbzzz\") pod \"crc-debug-cqm4x\" (UID: \"c32ccd55-da46-4fc1-ae60-44a31ff629c4\") " pod="openshift-must-gather-5tm2p/crc-debug-cqm4x" Nov 25 10:11:01 crc kubenswrapper[4565]: I1125 10:11:01.613660 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c32ccd55-da46-4fc1-ae60-44a31ff629c4-host\") pod \"crc-debug-cqm4x\" (UID: \"c32ccd55-da46-4fc1-ae60-44a31ff629c4\") " pod="openshift-must-gather-5tm2p/crc-debug-cqm4x" Nov 25 10:11:01 crc kubenswrapper[4565]: I1125 10:11:01.613790 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c32ccd55-da46-4fc1-ae60-44a31ff629c4-host\") pod \"crc-debug-cqm4x\" (UID: \"c32ccd55-da46-4fc1-ae60-44a31ff629c4\") " pod="openshift-must-gather-5tm2p/crc-debug-cqm4x" Nov 25 10:11:01 crc 
kubenswrapper[4565]: I1125 10:11:01.657555 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbzzz\" (UniqueName: \"kubernetes.io/projected/c32ccd55-da46-4fc1-ae60-44a31ff629c4-kube-api-access-kbzzz\") pod \"crc-debug-cqm4x\" (UID: \"c32ccd55-da46-4fc1-ae60-44a31ff629c4\") " pod="openshift-must-gather-5tm2p/crc-debug-cqm4x" Nov 25 10:11:01 crc kubenswrapper[4565]: I1125 10:11:01.793948 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5tm2p/crc-debug-cqm4x" Nov 25 10:11:01 crc kubenswrapper[4565]: I1125 10:11:01.994882 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s5zgl" Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.024969 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/844d82c9-70cb-404f-bcb9-3125b2d37407-utilities\") pod \"844d82c9-70cb-404f-bcb9-3125b2d37407\" (UID: \"844d82c9-70cb-404f-bcb9-3125b2d37407\") " Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.025359 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/844d82c9-70cb-404f-bcb9-3125b2d37407-catalog-content\") pod \"844d82c9-70cb-404f-bcb9-3125b2d37407\" (UID: \"844d82c9-70cb-404f-bcb9-3125b2d37407\") " Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.025636 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xj8s\" (UniqueName: \"kubernetes.io/projected/844d82c9-70cb-404f-bcb9-3125b2d37407-kube-api-access-2xj8s\") pod \"844d82c9-70cb-404f-bcb9-3125b2d37407\" (UID: \"844d82c9-70cb-404f-bcb9-3125b2d37407\") " Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.026862 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/844d82c9-70cb-404f-bcb9-3125b2d37407-utilities" (OuterVolumeSpecName: "utilities") pod "844d82c9-70cb-404f-bcb9-3125b2d37407" (UID: "844d82c9-70cb-404f-bcb9-3125b2d37407"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.038552 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/844d82c9-70cb-404f-bcb9-3125b2d37407-kube-api-access-2xj8s" (OuterVolumeSpecName: "kube-api-access-2xj8s") pod "844d82c9-70cb-404f-bcb9-3125b2d37407" (UID: "844d82c9-70cb-404f-bcb9-3125b2d37407"). InnerVolumeSpecName "kube-api-access-2xj8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.071510 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/844d82c9-70cb-404f-bcb9-3125b2d37407-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "844d82c9-70cb-404f-bcb9-3125b2d37407" (UID: "844d82c9-70cb-404f-bcb9-3125b2d37407"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.128818 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/844d82c9-70cb-404f-bcb9-3125b2d37407-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.128860 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xj8s\" (UniqueName: \"kubernetes.io/projected/844d82c9-70cb-404f-bcb9-3125b2d37407-kube-api-access-2xj8s\") on node \"crc\" DevicePath \"\"" Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.128874 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/844d82c9-70cb-404f-bcb9-3125b2d37407-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.409536 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5tm2p/crc-debug-cqm4x" event={"ID":"c32ccd55-da46-4fc1-ae60-44a31ff629c4","Type":"ContainerStarted","Data":"56d2cdd7b234e0a823aca6005963b6963bc6d7259845b7b4c6c8e6c646556964"} Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.409821 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5tm2p/crc-debug-cqm4x" event={"ID":"c32ccd55-da46-4fc1-ae60-44a31ff629c4","Type":"ContainerStarted","Data":"143abd997cefdbc02d2dcc04e5b6eaf9879f0d2d4f48a219d972fa61a7bde1e6"} Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.415737 4565 generic.go:334] "Generic (PLEG): container finished" podID="844d82c9-70cb-404f-bcb9-3125b2d37407" containerID="840daa5ddcc1b7e7b4a8db2d1b1c96ce34a7b30a3656128bda689eb0e7f00e2c" exitCode=0 Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.415770 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5zgl" 
event={"ID":"844d82c9-70cb-404f-bcb9-3125b2d37407","Type":"ContainerDied","Data":"840daa5ddcc1b7e7b4a8db2d1b1c96ce34a7b30a3656128bda689eb0e7f00e2c"} Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.415788 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5zgl" event={"ID":"844d82c9-70cb-404f-bcb9-3125b2d37407","Type":"ContainerDied","Data":"d88425fd7633938fd7199a829d0c07d98d1df8bca3f0e4016089ea1c2af3e5fd"} Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.415808 4565 scope.go:117] "RemoveContainer" containerID="840daa5ddcc1b7e7b4a8db2d1b1c96ce34a7b30a3656128bda689eb0e7f00e2c" Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.415938 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s5zgl" Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.427523 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5tm2p/crc-debug-cqm4x" podStartSLOduration=1.4275040319999999 podStartE2EDuration="1.427504032s" podCreationTimestamp="2025-11-25 10:11:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 10:11:02.422176541 +0000 UTC m=+3995.624671679" watchObservedRunningTime="2025-11-25 10:11:02.427504032 +0000 UTC m=+3995.629999170" Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.448415 4565 scope.go:117] "RemoveContainer" containerID="f662f004d2201c5faf7181549a2906a61f1f2ae798d8197f25b1b5df28533c73" Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.470480 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s5zgl"] Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.488141 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s5zgl"] Nov 25 10:11:02 crc kubenswrapper[4565]: 
I1125 10:11:02.489167 4565 scope.go:117] "RemoveContainer" containerID="0a373fba33ddf16e6afc5002fcc4c98a829226974f8d51ec4d1996c2f97037e1" Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.523177 4565 scope.go:117] "RemoveContainer" containerID="840daa5ddcc1b7e7b4a8db2d1b1c96ce34a7b30a3656128bda689eb0e7f00e2c" Nov 25 10:11:02 crc kubenswrapper[4565]: E1125 10:11:02.524004 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"840daa5ddcc1b7e7b4a8db2d1b1c96ce34a7b30a3656128bda689eb0e7f00e2c\": container with ID starting with 840daa5ddcc1b7e7b4a8db2d1b1c96ce34a7b30a3656128bda689eb0e7f00e2c not found: ID does not exist" containerID="840daa5ddcc1b7e7b4a8db2d1b1c96ce34a7b30a3656128bda689eb0e7f00e2c" Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.524052 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"840daa5ddcc1b7e7b4a8db2d1b1c96ce34a7b30a3656128bda689eb0e7f00e2c"} err="failed to get container status \"840daa5ddcc1b7e7b4a8db2d1b1c96ce34a7b30a3656128bda689eb0e7f00e2c\": rpc error: code = NotFound desc = could not find container \"840daa5ddcc1b7e7b4a8db2d1b1c96ce34a7b30a3656128bda689eb0e7f00e2c\": container with ID starting with 840daa5ddcc1b7e7b4a8db2d1b1c96ce34a7b30a3656128bda689eb0e7f00e2c not found: ID does not exist" Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.524095 4565 scope.go:117] "RemoveContainer" containerID="f662f004d2201c5faf7181549a2906a61f1f2ae798d8197f25b1b5df28533c73" Nov 25 10:11:02 crc kubenswrapper[4565]: E1125 10:11:02.524543 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f662f004d2201c5faf7181549a2906a61f1f2ae798d8197f25b1b5df28533c73\": container with ID starting with f662f004d2201c5faf7181549a2906a61f1f2ae798d8197f25b1b5df28533c73 not found: ID does not exist" 
containerID="f662f004d2201c5faf7181549a2906a61f1f2ae798d8197f25b1b5df28533c73" Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.524571 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f662f004d2201c5faf7181549a2906a61f1f2ae798d8197f25b1b5df28533c73"} err="failed to get container status \"f662f004d2201c5faf7181549a2906a61f1f2ae798d8197f25b1b5df28533c73\": rpc error: code = NotFound desc = could not find container \"f662f004d2201c5faf7181549a2906a61f1f2ae798d8197f25b1b5df28533c73\": container with ID starting with f662f004d2201c5faf7181549a2906a61f1f2ae798d8197f25b1b5df28533c73 not found: ID does not exist" Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.524586 4565 scope.go:117] "RemoveContainer" containerID="0a373fba33ddf16e6afc5002fcc4c98a829226974f8d51ec4d1996c2f97037e1" Nov 25 10:11:02 crc kubenswrapper[4565]: E1125 10:11:02.524801 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a373fba33ddf16e6afc5002fcc4c98a829226974f8d51ec4d1996c2f97037e1\": container with ID starting with 0a373fba33ddf16e6afc5002fcc4c98a829226974f8d51ec4d1996c2f97037e1 not found: ID does not exist" containerID="0a373fba33ddf16e6afc5002fcc4c98a829226974f8d51ec4d1996c2f97037e1" Nov 25 10:11:02 crc kubenswrapper[4565]: I1125 10:11:02.524825 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a373fba33ddf16e6afc5002fcc4c98a829226974f8d51ec4d1996c2f97037e1"} err="failed to get container status \"0a373fba33ddf16e6afc5002fcc4c98a829226974f8d51ec4d1996c2f97037e1\": rpc error: code = NotFound desc = could not find container \"0a373fba33ddf16e6afc5002fcc4c98a829226974f8d51ec4d1996c2f97037e1\": container with ID starting with 0a373fba33ddf16e6afc5002fcc4c98a829226974f8d51ec4d1996c2f97037e1 not found: ID does not exist" Nov 25 10:11:03 crc kubenswrapper[4565]: I1125 10:11:03.108698 4565 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="844d82c9-70cb-404f-bcb9-3125b2d37407" path="/var/lib/kubelet/pods/844d82c9-70cb-404f-bcb9-3125b2d37407/volumes" Nov 25 10:11:03 crc kubenswrapper[4565]: I1125 10:11:03.529711 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ghgcv" Nov 25 10:11:03 crc kubenswrapper[4565]: I1125 10:11:03.529874 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ghgcv" Nov 25 10:11:03 crc kubenswrapper[4565]: I1125 10:11:03.587130 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ghgcv" Nov 25 10:11:04 crc kubenswrapper[4565]: I1125 10:11:04.486858 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ghgcv" Nov 25 10:11:05 crc kubenswrapper[4565]: I1125 10:11:05.010208 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghgcv"] Nov 25 10:11:06 crc kubenswrapper[4565]: I1125 10:11:06.467681 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ghgcv" podUID="76042ab9-7ae8-4caa-bcb3-97daaa72db31" containerName="registry-server" containerID="cri-o://df59234170c99eaafe94e34abcd99f4a44d36fce584c276ebef155ff6cf353c3" gracePeriod=2 Nov 25 10:11:06 crc kubenswrapper[4565]: I1125 10:11:06.974703 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghgcv" Nov 25 10:11:07 crc kubenswrapper[4565]: I1125 10:11:07.053805 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76042ab9-7ae8-4caa-bcb3-97daaa72db31-utilities\") pod \"76042ab9-7ae8-4caa-bcb3-97daaa72db31\" (UID: \"76042ab9-7ae8-4caa-bcb3-97daaa72db31\") " Nov 25 10:11:07 crc kubenswrapper[4565]: I1125 10:11:07.054158 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76042ab9-7ae8-4caa-bcb3-97daaa72db31-catalog-content\") pod \"76042ab9-7ae8-4caa-bcb3-97daaa72db31\" (UID: \"76042ab9-7ae8-4caa-bcb3-97daaa72db31\") " Nov 25 10:11:07 crc kubenswrapper[4565]: I1125 10:11:07.054271 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm87l\" (UniqueName: \"kubernetes.io/projected/76042ab9-7ae8-4caa-bcb3-97daaa72db31-kube-api-access-qm87l\") pod \"76042ab9-7ae8-4caa-bcb3-97daaa72db31\" (UID: \"76042ab9-7ae8-4caa-bcb3-97daaa72db31\") " Nov 25 10:11:07 crc kubenswrapper[4565]: I1125 10:11:07.064641 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76042ab9-7ae8-4caa-bcb3-97daaa72db31-utilities" (OuterVolumeSpecName: "utilities") pod "76042ab9-7ae8-4caa-bcb3-97daaa72db31" (UID: "76042ab9-7ae8-4caa-bcb3-97daaa72db31"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 10:11:07 crc kubenswrapper[4565]: I1125 10:11:07.070510 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76042ab9-7ae8-4caa-bcb3-97daaa72db31-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76042ab9-7ae8-4caa-bcb3-97daaa72db31" (UID: "76042ab9-7ae8-4caa-bcb3-97daaa72db31"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 10:11:07 crc kubenswrapper[4565]: I1125 10:11:07.079137 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76042ab9-7ae8-4caa-bcb3-97daaa72db31-kube-api-access-qm87l" (OuterVolumeSpecName: "kube-api-access-qm87l") pod "76042ab9-7ae8-4caa-bcb3-97daaa72db31" (UID: "76042ab9-7ae8-4caa-bcb3-97daaa72db31"). InnerVolumeSpecName "kube-api-access-qm87l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 10:11:07 crc kubenswrapper[4565]: I1125 10:11:07.159002 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76042ab9-7ae8-4caa-bcb3-97daaa72db31-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 10:11:07 crc kubenswrapper[4565]: I1125 10:11:07.159039 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm87l\" (UniqueName: \"kubernetes.io/projected/76042ab9-7ae8-4caa-bcb3-97daaa72db31-kube-api-access-qm87l\") on node \"crc\" DevicePath \"\"" Nov 25 10:11:07 crc kubenswrapper[4565]: I1125 10:11:07.159055 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76042ab9-7ae8-4caa-bcb3-97daaa72db31-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 10:11:07 crc kubenswrapper[4565]: I1125 10:11:07.480320 4565 generic.go:334] "Generic (PLEG): container finished" podID="76042ab9-7ae8-4caa-bcb3-97daaa72db31" containerID="df59234170c99eaafe94e34abcd99f4a44d36fce584c276ebef155ff6cf353c3" exitCode=0 Nov 25 10:11:07 crc kubenswrapper[4565]: I1125 10:11:07.480402 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghgcv" event={"ID":"76042ab9-7ae8-4caa-bcb3-97daaa72db31","Type":"ContainerDied","Data":"df59234170c99eaafe94e34abcd99f4a44d36fce584c276ebef155ff6cf353c3"} Nov 25 10:11:07 crc kubenswrapper[4565]: I1125 10:11:07.480439 4565 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghgcv" Nov 25 10:11:07 crc kubenswrapper[4565]: I1125 10:11:07.480487 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghgcv" event={"ID":"76042ab9-7ae8-4caa-bcb3-97daaa72db31","Type":"ContainerDied","Data":"eae15fe669214dee6328b36afb9af352d77092e20d0b67ff5d011f289946c91d"} Nov 25 10:11:07 crc kubenswrapper[4565]: I1125 10:11:07.480517 4565 scope.go:117] "RemoveContainer" containerID="df59234170c99eaafe94e34abcd99f4a44d36fce584c276ebef155ff6cf353c3" Nov 25 10:11:07 crc kubenswrapper[4565]: I1125 10:11:07.518670 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghgcv"] Nov 25 10:11:07 crc kubenswrapper[4565]: I1125 10:11:07.528485 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghgcv"] Nov 25 10:11:07 crc kubenswrapper[4565]: I1125 10:11:07.532557 4565 scope.go:117] "RemoveContainer" containerID="89c55b61c4fa111bc1d4ea5ab22efa6f4de744abbce03e63154b4cd6f5654d9a" Nov 25 10:11:07 crc kubenswrapper[4565]: I1125 10:11:07.562997 4565 scope.go:117] "RemoveContainer" containerID="332e8b93c99a9a1070f4aab766ac6022f436169730c8cc423a8287135c6dd855" Nov 25 10:11:07 crc kubenswrapper[4565]: I1125 10:11:07.592121 4565 scope.go:117] "RemoveContainer" containerID="df59234170c99eaafe94e34abcd99f4a44d36fce584c276ebef155ff6cf353c3" Nov 25 10:11:07 crc kubenswrapper[4565]: E1125 10:11:07.593111 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df59234170c99eaafe94e34abcd99f4a44d36fce584c276ebef155ff6cf353c3\": container with ID starting with df59234170c99eaafe94e34abcd99f4a44d36fce584c276ebef155ff6cf353c3 not found: ID does not exist" containerID="df59234170c99eaafe94e34abcd99f4a44d36fce584c276ebef155ff6cf353c3" Nov 25 10:11:07 crc kubenswrapper[4565]: I1125 
10:11:07.593150 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df59234170c99eaafe94e34abcd99f4a44d36fce584c276ebef155ff6cf353c3"} err="failed to get container status \"df59234170c99eaafe94e34abcd99f4a44d36fce584c276ebef155ff6cf353c3\": rpc error: code = NotFound desc = could not find container \"df59234170c99eaafe94e34abcd99f4a44d36fce584c276ebef155ff6cf353c3\": container with ID starting with df59234170c99eaafe94e34abcd99f4a44d36fce584c276ebef155ff6cf353c3 not found: ID does not exist" Nov 25 10:11:07 crc kubenswrapper[4565]: I1125 10:11:07.593176 4565 scope.go:117] "RemoveContainer" containerID="89c55b61c4fa111bc1d4ea5ab22efa6f4de744abbce03e63154b4cd6f5654d9a" Nov 25 10:11:07 crc kubenswrapper[4565]: E1125 10:11:07.593580 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89c55b61c4fa111bc1d4ea5ab22efa6f4de744abbce03e63154b4cd6f5654d9a\": container with ID starting with 89c55b61c4fa111bc1d4ea5ab22efa6f4de744abbce03e63154b4cd6f5654d9a not found: ID does not exist" containerID="89c55b61c4fa111bc1d4ea5ab22efa6f4de744abbce03e63154b4cd6f5654d9a" Nov 25 10:11:07 crc kubenswrapper[4565]: I1125 10:11:07.593606 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89c55b61c4fa111bc1d4ea5ab22efa6f4de744abbce03e63154b4cd6f5654d9a"} err="failed to get container status \"89c55b61c4fa111bc1d4ea5ab22efa6f4de744abbce03e63154b4cd6f5654d9a\": rpc error: code = NotFound desc = could not find container \"89c55b61c4fa111bc1d4ea5ab22efa6f4de744abbce03e63154b4cd6f5654d9a\": container with ID starting with 89c55b61c4fa111bc1d4ea5ab22efa6f4de744abbce03e63154b4cd6f5654d9a not found: ID does not exist" Nov 25 10:11:07 crc kubenswrapper[4565]: I1125 10:11:07.593625 4565 scope.go:117] "RemoveContainer" containerID="332e8b93c99a9a1070f4aab766ac6022f436169730c8cc423a8287135c6dd855" Nov 25 10:11:07 crc 
kubenswrapper[4565]: E1125 10:11:07.593898 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"332e8b93c99a9a1070f4aab766ac6022f436169730c8cc423a8287135c6dd855\": container with ID starting with 332e8b93c99a9a1070f4aab766ac6022f436169730c8cc423a8287135c6dd855 not found: ID does not exist" containerID="332e8b93c99a9a1070f4aab766ac6022f436169730c8cc423a8287135c6dd855" Nov 25 10:11:07 crc kubenswrapper[4565]: I1125 10:11:07.593944 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"332e8b93c99a9a1070f4aab766ac6022f436169730c8cc423a8287135c6dd855"} err="failed to get container status \"332e8b93c99a9a1070f4aab766ac6022f436169730c8cc423a8287135c6dd855\": rpc error: code = NotFound desc = could not find container \"332e8b93c99a9a1070f4aab766ac6022f436169730c8cc423a8287135c6dd855\": container with ID starting with 332e8b93c99a9a1070f4aab766ac6022f436169730c8cc423a8287135c6dd855 not found: ID does not exist" Nov 25 10:11:09 crc kubenswrapper[4565]: I1125 10:11:09.107049 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76042ab9-7ae8-4caa-bcb3-97daaa72db31" path="/var/lib/kubelet/pods/76042ab9-7ae8-4caa-bcb3-97daaa72db31/volumes" Nov 25 10:11:25 crc kubenswrapper[4565]: I1125 10:11:25.099164 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 10:11:25 crc kubenswrapper[4565]: I1125 10:11:25.100700 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Nov 25 10:11:29 crc kubenswrapper[4565]: I1125 10:11:29.682199 4565 generic.go:334] "Generic (PLEG): container finished" podID="c32ccd55-da46-4fc1-ae60-44a31ff629c4" containerID="56d2cdd7b234e0a823aca6005963b6963bc6d7259845b7b4c6c8e6c646556964" exitCode=0 Nov 25 10:11:29 crc kubenswrapper[4565]: I1125 10:11:29.682373 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5tm2p/crc-debug-cqm4x" event={"ID":"c32ccd55-da46-4fc1-ae60-44a31ff629c4","Type":"ContainerDied","Data":"56d2cdd7b234e0a823aca6005963b6963bc6d7259845b7b4c6c8e6c646556964"} Nov 25 10:11:30 crc kubenswrapper[4565]: I1125 10:11:30.782199 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5tm2p/crc-debug-cqm4x" Nov 25 10:11:30 crc kubenswrapper[4565]: I1125 10:11:30.817664 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5tm2p/crc-debug-cqm4x"] Nov 25 10:11:30 crc kubenswrapper[4565]: I1125 10:11:30.824784 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5tm2p/crc-debug-cqm4x"] Nov 25 10:11:30 crc kubenswrapper[4565]: I1125 10:11:30.928500 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c32ccd55-da46-4fc1-ae60-44a31ff629c4-host\") pod \"c32ccd55-da46-4fc1-ae60-44a31ff629c4\" (UID: \"c32ccd55-da46-4fc1-ae60-44a31ff629c4\") " Nov 25 10:11:30 crc kubenswrapper[4565]: I1125 10:11:30.928604 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbzzz\" (UniqueName: \"kubernetes.io/projected/c32ccd55-da46-4fc1-ae60-44a31ff629c4-kube-api-access-kbzzz\") pod \"c32ccd55-da46-4fc1-ae60-44a31ff629c4\" (UID: \"c32ccd55-da46-4fc1-ae60-44a31ff629c4\") " Nov 25 10:11:30 crc kubenswrapper[4565]: I1125 10:11:30.929990 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/c32ccd55-da46-4fc1-ae60-44a31ff629c4-host" (OuterVolumeSpecName: "host") pod "c32ccd55-da46-4fc1-ae60-44a31ff629c4" (UID: "c32ccd55-da46-4fc1-ae60-44a31ff629c4"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 10:11:30 crc kubenswrapper[4565]: I1125 10:11:30.936325 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c32ccd55-da46-4fc1-ae60-44a31ff629c4-kube-api-access-kbzzz" (OuterVolumeSpecName: "kube-api-access-kbzzz") pod "c32ccd55-da46-4fc1-ae60-44a31ff629c4" (UID: "c32ccd55-da46-4fc1-ae60-44a31ff629c4"). InnerVolumeSpecName "kube-api-access-kbzzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 10:11:31 crc kubenswrapper[4565]: I1125 10:11:31.030873 4565 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c32ccd55-da46-4fc1-ae60-44a31ff629c4-host\") on node \"crc\" DevicePath \"\"" Nov 25 10:11:31 crc kubenswrapper[4565]: I1125 10:11:31.030916 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbzzz\" (UniqueName: \"kubernetes.io/projected/c32ccd55-da46-4fc1-ae60-44a31ff629c4-kube-api-access-kbzzz\") on node \"crc\" DevicePath \"\"" Nov 25 10:11:31 crc kubenswrapper[4565]: I1125 10:11:31.106626 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c32ccd55-da46-4fc1-ae60-44a31ff629c4" path="/var/lib/kubelet/pods/c32ccd55-da46-4fc1-ae60-44a31ff629c4/volumes" Nov 25 10:11:31 crc kubenswrapper[4565]: I1125 10:11:31.707300 4565 scope.go:117] "RemoveContainer" containerID="56d2cdd7b234e0a823aca6005963b6963bc6d7259845b7b4c6c8e6c646556964" Nov 25 10:11:31 crc kubenswrapper[4565]: I1125 10:11:31.707342 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5tm2p/crc-debug-cqm4x" Nov 25 10:11:32 crc kubenswrapper[4565]: I1125 10:11:32.213810 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5tm2p/crc-debug-78qxw"] Nov 25 10:11:32 crc kubenswrapper[4565]: E1125 10:11:32.214458 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76042ab9-7ae8-4caa-bcb3-97daaa72db31" containerName="extract-content" Nov 25 10:11:32 crc kubenswrapper[4565]: I1125 10:11:32.214474 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="76042ab9-7ae8-4caa-bcb3-97daaa72db31" containerName="extract-content" Nov 25 10:11:32 crc kubenswrapper[4565]: E1125 10:11:32.214494 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76042ab9-7ae8-4caa-bcb3-97daaa72db31" containerName="extract-utilities" Nov 25 10:11:32 crc kubenswrapper[4565]: I1125 10:11:32.214501 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="76042ab9-7ae8-4caa-bcb3-97daaa72db31" containerName="extract-utilities" Nov 25 10:11:32 crc kubenswrapper[4565]: E1125 10:11:32.214512 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76042ab9-7ae8-4caa-bcb3-97daaa72db31" containerName="registry-server" Nov 25 10:11:32 crc kubenswrapper[4565]: I1125 10:11:32.214519 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="76042ab9-7ae8-4caa-bcb3-97daaa72db31" containerName="registry-server" Nov 25 10:11:32 crc kubenswrapper[4565]: E1125 10:11:32.214541 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="844d82c9-70cb-404f-bcb9-3125b2d37407" containerName="extract-utilities" Nov 25 10:11:32 crc kubenswrapper[4565]: I1125 10:11:32.214546 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="844d82c9-70cb-404f-bcb9-3125b2d37407" containerName="extract-utilities" Nov 25 10:11:32 crc kubenswrapper[4565]: E1125 10:11:32.214562 4565 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c32ccd55-da46-4fc1-ae60-44a31ff629c4" containerName="container-00" Nov 25 10:11:32 crc kubenswrapper[4565]: I1125 10:11:32.214568 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c32ccd55-da46-4fc1-ae60-44a31ff629c4" containerName="container-00" Nov 25 10:11:32 crc kubenswrapper[4565]: E1125 10:11:32.214583 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="844d82c9-70cb-404f-bcb9-3125b2d37407" containerName="extract-content" Nov 25 10:11:32 crc kubenswrapper[4565]: I1125 10:11:32.214589 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="844d82c9-70cb-404f-bcb9-3125b2d37407" containerName="extract-content" Nov 25 10:11:32 crc kubenswrapper[4565]: E1125 10:11:32.214600 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="844d82c9-70cb-404f-bcb9-3125b2d37407" containerName="registry-server" Nov 25 10:11:32 crc kubenswrapper[4565]: I1125 10:11:32.214606 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="844d82c9-70cb-404f-bcb9-3125b2d37407" containerName="registry-server" Nov 25 10:11:32 crc kubenswrapper[4565]: I1125 10:11:32.214788 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="844d82c9-70cb-404f-bcb9-3125b2d37407" containerName="registry-server" Nov 25 10:11:32 crc kubenswrapper[4565]: I1125 10:11:32.214800 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="c32ccd55-da46-4fc1-ae60-44a31ff629c4" containerName="container-00" Nov 25 10:11:32 crc kubenswrapper[4565]: I1125 10:11:32.214821 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="76042ab9-7ae8-4caa-bcb3-97daaa72db31" containerName="registry-server" Nov 25 10:11:32 crc kubenswrapper[4565]: I1125 10:11:32.215451 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5tm2p/crc-debug-78qxw" Nov 25 10:11:32 crc kubenswrapper[4565]: I1125 10:11:32.365190 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a03dae19-da9f-4713-99c5-c0a4a3deb625-host\") pod \"crc-debug-78qxw\" (UID: \"a03dae19-da9f-4713-99c5-c0a4a3deb625\") " pod="openshift-must-gather-5tm2p/crc-debug-78qxw" Nov 25 10:11:32 crc kubenswrapper[4565]: I1125 10:11:32.365566 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wlvw\" (UniqueName: \"kubernetes.io/projected/a03dae19-da9f-4713-99c5-c0a4a3deb625-kube-api-access-2wlvw\") pod \"crc-debug-78qxw\" (UID: \"a03dae19-da9f-4713-99c5-c0a4a3deb625\") " pod="openshift-must-gather-5tm2p/crc-debug-78qxw" Nov 25 10:11:32 crc kubenswrapper[4565]: I1125 10:11:32.469207 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a03dae19-da9f-4713-99c5-c0a4a3deb625-host\") pod \"crc-debug-78qxw\" (UID: \"a03dae19-da9f-4713-99c5-c0a4a3deb625\") " pod="openshift-must-gather-5tm2p/crc-debug-78qxw" Nov 25 10:11:32 crc kubenswrapper[4565]: I1125 10:11:32.469354 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wlvw\" (UniqueName: \"kubernetes.io/projected/a03dae19-da9f-4713-99c5-c0a4a3deb625-kube-api-access-2wlvw\") pod \"crc-debug-78qxw\" (UID: \"a03dae19-da9f-4713-99c5-c0a4a3deb625\") " pod="openshift-must-gather-5tm2p/crc-debug-78qxw" Nov 25 10:11:32 crc kubenswrapper[4565]: I1125 10:11:32.469468 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a03dae19-da9f-4713-99c5-c0a4a3deb625-host\") pod \"crc-debug-78qxw\" (UID: \"a03dae19-da9f-4713-99c5-c0a4a3deb625\") " pod="openshift-must-gather-5tm2p/crc-debug-78qxw" Nov 25 10:11:32 crc 
kubenswrapper[4565]: I1125 10:11:32.488073 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wlvw\" (UniqueName: \"kubernetes.io/projected/a03dae19-da9f-4713-99c5-c0a4a3deb625-kube-api-access-2wlvw\") pod \"crc-debug-78qxw\" (UID: \"a03dae19-da9f-4713-99c5-c0a4a3deb625\") " pod="openshift-must-gather-5tm2p/crc-debug-78qxw" Nov 25 10:11:32 crc kubenswrapper[4565]: I1125 10:11:32.531479 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5tm2p/crc-debug-78qxw" Nov 25 10:11:32 crc kubenswrapper[4565]: W1125 10:11:32.570680 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda03dae19_da9f_4713_99c5_c0a4a3deb625.slice/crio-eb583e59ce9c6a78286ccf7a99f4b1e455e54c7fc5924f973a65f737b0580e5d WatchSource:0}: Error finding container eb583e59ce9c6a78286ccf7a99f4b1e455e54c7fc5924f973a65f737b0580e5d: Status 404 returned error can't find the container with id eb583e59ce9c6a78286ccf7a99f4b1e455e54c7fc5924f973a65f737b0580e5d Nov 25 10:11:32 crc kubenswrapper[4565]: I1125 10:11:32.717538 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5tm2p/crc-debug-78qxw" event={"ID":"a03dae19-da9f-4713-99c5-c0a4a3deb625","Type":"ContainerStarted","Data":"eb583e59ce9c6a78286ccf7a99f4b1e455e54c7fc5924f973a65f737b0580e5d"} Nov 25 10:11:33 crc kubenswrapper[4565]: I1125 10:11:33.729970 4565 generic.go:334] "Generic (PLEG): container finished" podID="a03dae19-da9f-4713-99c5-c0a4a3deb625" containerID="bdc5f96177b92c797138f624a8211499623ad4aabc6094ed2951b02ed399f975" exitCode=0 Nov 25 10:11:33 crc kubenswrapper[4565]: I1125 10:11:33.730061 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5tm2p/crc-debug-78qxw" event={"ID":"a03dae19-da9f-4713-99c5-c0a4a3deb625","Type":"ContainerDied","Data":"bdc5f96177b92c797138f624a8211499623ad4aabc6094ed2951b02ed399f975"} Nov 25 
10:11:34 crc kubenswrapper[4565]: I1125 10:11:34.146147 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5tm2p/crc-debug-78qxw"] Nov 25 10:11:34 crc kubenswrapper[4565]: I1125 10:11:34.156776 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5tm2p/crc-debug-78qxw"] Nov 25 10:11:34 crc kubenswrapper[4565]: I1125 10:11:34.824952 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5tm2p/crc-debug-78qxw" Nov 25 10:11:34 crc kubenswrapper[4565]: I1125 10:11:34.928091 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wlvw\" (UniqueName: \"kubernetes.io/projected/a03dae19-da9f-4713-99c5-c0a4a3deb625-kube-api-access-2wlvw\") pod \"a03dae19-da9f-4713-99c5-c0a4a3deb625\" (UID: \"a03dae19-da9f-4713-99c5-c0a4a3deb625\") " Nov 25 10:11:34 crc kubenswrapper[4565]: I1125 10:11:34.928182 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a03dae19-da9f-4713-99c5-c0a4a3deb625-host\") pod \"a03dae19-da9f-4713-99c5-c0a4a3deb625\" (UID: \"a03dae19-da9f-4713-99c5-c0a4a3deb625\") " Nov 25 10:11:34 crc kubenswrapper[4565]: I1125 10:11:34.928269 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a03dae19-da9f-4713-99c5-c0a4a3deb625-host" (OuterVolumeSpecName: "host") pod "a03dae19-da9f-4713-99c5-c0a4a3deb625" (UID: "a03dae19-da9f-4713-99c5-c0a4a3deb625"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 10:11:34 crc kubenswrapper[4565]: I1125 10:11:34.928533 4565 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a03dae19-da9f-4713-99c5-c0a4a3deb625-host\") on node \"crc\" DevicePath \"\"" Nov 25 10:11:34 crc kubenswrapper[4565]: I1125 10:11:34.936792 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a03dae19-da9f-4713-99c5-c0a4a3deb625-kube-api-access-2wlvw" (OuterVolumeSpecName: "kube-api-access-2wlvw") pod "a03dae19-da9f-4713-99c5-c0a4a3deb625" (UID: "a03dae19-da9f-4713-99c5-c0a4a3deb625"). InnerVolumeSpecName "kube-api-access-2wlvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 10:11:35 crc kubenswrapper[4565]: I1125 10:11:35.031015 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wlvw\" (UniqueName: \"kubernetes.io/projected/a03dae19-da9f-4713-99c5-c0a4a3deb625-kube-api-access-2wlvw\") on node \"crc\" DevicePath \"\"" Nov 25 10:11:35 crc kubenswrapper[4565]: I1125 10:11:35.106409 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a03dae19-da9f-4713-99c5-c0a4a3deb625" path="/var/lib/kubelet/pods/a03dae19-da9f-4713-99c5-c0a4a3deb625/volumes" Nov 25 10:11:35 crc kubenswrapper[4565]: I1125 10:11:35.369417 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5tm2p/crc-debug-q6b8r"] Nov 25 10:11:35 crc kubenswrapper[4565]: E1125 10:11:35.369777 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a03dae19-da9f-4713-99c5-c0a4a3deb625" containerName="container-00" Nov 25 10:11:35 crc kubenswrapper[4565]: I1125 10:11:35.369794 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a03dae19-da9f-4713-99c5-c0a4a3deb625" containerName="container-00" Nov 25 10:11:35 crc kubenswrapper[4565]: I1125 10:11:35.369975 4565 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a03dae19-da9f-4713-99c5-c0a4a3deb625" containerName="container-00" Nov 25 10:11:35 crc kubenswrapper[4565]: I1125 10:11:35.370603 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5tm2p/crc-debug-q6b8r" Nov 25 10:11:35 crc kubenswrapper[4565]: I1125 10:11:35.548403 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrk8f\" (UniqueName: \"kubernetes.io/projected/ab4f21b5-182c-4470-9726-68f870ff0e51-kube-api-access-lrk8f\") pod \"crc-debug-q6b8r\" (UID: \"ab4f21b5-182c-4470-9726-68f870ff0e51\") " pod="openshift-must-gather-5tm2p/crc-debug-q6b8r" Nov 25 10:11:35 crc kubenswrapper[4565]: I1125 10:11:35.548905 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab4f21b5-182c-4470-9726-68f870ff0e51-host\") pod \"crc-debug-q6b8r\" (UID: \"ab4f21b5-182c-4470-9726-68f870ff0e51\") " pod="openshift-must-gather-5tm2p/crc-debug-q6b8r" Nov 25 10:11:35 crc kubenswrapper[4565]: I1125 10:11:35.651434 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab4f21b5-182c-4470-9726-68f870ff0e51-host\") pod \"crc-debug-q6b8r\" (UID: \"ab4f21b5-182c-4470-9726-68f870ff0e51\") " pod="openshift-must-gather-5tm2p/crc-debug-q6b8r" Nov 25 10:11:35 crc kubenswrapper[4565]: I1125 10:11:35.651587 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrk8f\" (UniqueName: \"kubernetes.io/projected/ab4f21b5-182c-4470-9726-68f870ff0e51-kube-api-access-lrk8f\") pod \"crc-debug-q6b8r\" (UID: \"ab4f21b5-182c-4470-9726-68f870ff0e51\") " pod="openshift-must-gather-5tm2p/crc-debug-q6b8r" Nov 25 10:11:35 crc kubenswrapper[4565]: I1125 10:11:35.651650 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/ab4f21b5-182c-4470-9726-68f870ff0e51-host\") pod \"crc-debug-q6b8r\" (UID: \"ab4f21b5-182c-4470-9726-68f870ff0e51\") " pod="openshift-must-gather-5tm2p/crc-debug-q6b8r" Nov 25 10:11:35 crc kubenswrapper[4565]: I1125 10:11:35.670269 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrk8f\" (UniqueName: \"kubernetes.io/projected/ab4f21b5-182c-4470-9726-68f870ff0e51-kube-api-access-lrk8f\") pod \"crc-debug-q6b8r\" (UID: \"ab4f21b5-182c-4470-9726-68f870ff0e51\") " pod="openshift-must-gather-5tm2p/crc-debug-q6b8r" Nov 25 10:11:35 crc kubenswrapper[4565]: I1125 10:11:35.686503 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5tm2p/crc-debug-q6b8r" Nov 25 10:11:35 crc kubenswrapper[4565]: W1125 10:11:35.743134 4565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab4f21b5_182c_4470_9726_68f870ff0e51.slice/crio-ef64cb09913de790b6e38b24c46e84167caa1c45172d44fbae2ed0e528b8959a WatchSource:0}: Error finding container ef64cb09913de790b6e38b24c46e84167caa1c45172d44fbae2ed0e528b8959a: Status 404 returned error can't find the container with id ef64cb09913de790b6e38b24c46e84167caa1c45172d44fbae2ed0e528b8959a Nov 25 10:11:35 crc kubenswrapper[4565]: I1125 10:11:35.754825 4565 scope.go:117] "RemoveContainer" containerID="bdc5f96177b92c797138f624a8211499623ad4aabc6094ed2951b02ed399f975" Nov 25 10:11:35 crc kubenswrapper[4565]: I1125 10:11:35.754967 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5tm2p/crc-debug-78qxw" Nov 25 10:11:36 crc kubenswrapper[4565]: I1125 10:11:36.767311 4565 generic.go:334] "Generic (PLEG): container finished" podID="ab4f21b5-182c-4470-9726-68f870ff0e51" containerID="106b30613e8c5fb2ddde99a4d74b9a552acc5ca7ac8dc311b472362818779943" exitCode=0 Nov 25 10:11:36 crc kubenswrapper[4565]: I1125 10:11:36.767409 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5tm2p/crc-debug-q6b8r" event={"ID":"ab4f21b5-182c-4470-9726-68f870ff0e51","Type":"ContainerDied","Data":"106b30613e8c5fb2ddde99a4d74b9a552acc5ca7ac8dc311b472362818779943"} Nov 25 10:11:36 crc kubenswrapper[4565]: I1125 10:11:36.767691 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5tm2p/crc-debug-q6b8r" event={"ID":"ab4f21b5-182c-4470-9726-68f870ff0e51","Type":"ContainerStarted","Data":"ef64cb09913de790b6e38b24c46e84167caa1c45172d44fbae2ed0e528b8959a"} Nov 25 10:11:36 crc kubenswrapper[4565]: I1125 10:11:36.800308 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5tm2p/crc-debug-q6b8r"] Nov 25 10:11:36 crc kubenswrapper[4565]: I1125 10:11:36.810769 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5tm2p/crc-debug-q6b8r"] Nov 25 10:11:37 crc kubenswrapper[4565]: I1125 10:11:37.867617 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5tm2p/crc-debug-q6b8r" Nov 25 10:11:38 crc kubenswrapper[4565]: I1125 10:11:38.019333 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab4f21b5-182c-4470-9726-68f870ff0e51-host\") pod \"ab4f21b5-182c-4470-9726-68f870ff0e51\" (UID: \"ab4f21b5-182c-4470-9726-68f870ff0e51\") " Nov 25 10:11:38 crc kubenswrapper[4565]: I1125 10:11:38.019557 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrk8f\" (UniqueName: \"kubernetes.io/projected/ab4f21b5-182c-4470-9726-68f870ff0e51-kube-api-access-lrk8f\") pod \"ab4f21b5-182c-4470-9726-68f870ff0e51\" (UID: \"ab4f21b5-182c-4470-9726-68f870ff0e51\") " Nov 25 10:11:38 crc kubenswrapper[4565]: I1125 10:11:38.019654 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab4f21b5-182c-4470-9726-68f870ff0e51-host" (OuterVolumeSpecName: "host") pod "ab4f21b5-182c-4470-9726-68f870ff0e51" (UID: "ab4f21b5-182c-4470-9726-68f870ff0e51"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 10:11:38 crc kubenswrapper[4565]: I1125 10:11:38.020195 4565 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab4f21b5-182c-4470-9726-68f870ff0e51-host\") on node \"crc\" DevicePath \"\"" Nov 25 10:11:38 crc kubenswrapper[4565]: I1125 10:11:38.025189 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab4f21b5-182c-4470-9726-68f870ff0e51-kube-api-access-lrk8f" (OuterVolumeSpecName: "kube-api-access-lrk8f") pod "ab4f21b5-182c-4470-9726-68f870ff0e51" (UID: "ab4f21b5-182c-4470-9726-68f870ff0e51"). InnerVolumeSpecName "kube-api-access-lrk8f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 10:11:38 crc kubenswrapper[4565]: I1125 10:11:38.123239 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrk8f\" (UniqueName: \"kubernetes.io/projected/ab4f21b5-182c-4470-9726-68f870ff0e51-kube-api-access-lrk8f\") on node \"crc\" DevicePath \"\"" Nov 25 10:11:38 crc kubenswrapper[4565]: I1125 10:11:38.790576 4565 scope.go:117] "RemoveContainer" containerID="106b30613e8c5fb2ddde99a4d74b9a552acc5ca7ac8dc311b472362818779943" Nov 25 10:11:38 crc kubenswrapper[4565]: I1125 10:11:38.790625 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5tm2p/crc-debug-q6b8r" Nov 25 10:11:39 crc kubenswrapper[4565]: I1125 10:11:39.107823 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab4f21b5-182c-4470-9726-68f870ff0e51" path="/var/lib/kubelet/pods/ab4f21b5-182c-4470-9726-68f870ff0e51/volumes" Nov 25 10:11:46 crc kubenswrapper[4565]: I1125 10:11:46.817236 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9zkcw"] Nov 25 10:11:46 crc kubenswrapper[4565]: E1125 10:11:46.818242 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab4f21b5-182c-4470-9726-68f870ff0e51" containerName="container-00" Nov 25 10:11:46 crc kubenswrapper[4565]: I1125 10:11:46.818256 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab4f21b5-182c-4470-9726-68f870ff0e51" containerName="container-00" Nov 25 10:11:46 crc kubenswrapper[4565]: I1125 10:11:46.818734 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab4f21b5-182c-4470-9726-68f870ff0e51" containerName="container-00" Nov 25 10:11:46 crc kubenswrapper[4565]: I1125 10:11:46.820267 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9zkcw" Nov 25 10:11:46 crc kubenswrapper[4565]: I1125 10:11:46.823551 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9zkcw"] Nov 25 10:11:46 crc kubenswrapper[4565]: I1125 10:11:46.930565 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6mn9\" (UniqueName: \"kubernetes.io/projected/6b53fe19-e26e-4ad2-a8fb-71de02002ef1-kube-api-access-t6mn9\") pod \"redhat-operators-9zkcw\" (UID: \"6b53fe19-e26e-4ad2-a8fb-71de02002ef1\") " pod="openshift-marketplace/redhat-operators-9zkcw" Nov 25 10:11:46 crc kubenswrapper[4565]: I1125 10:11:46.930641 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b53fe19-e26e-4ad2-a8fb-71de02002ef1-utilities\") pod \"redhat-operators-9zkcw\" (UID: \"6b53fe19-e26e-4ad2-a8fb-71de02002ef1\") " pod="openshift-marketplace/redhat-operators-9zkcw" Nov 25 10:11:46 crc kubenswrapper[4565]: I1125 10:11:46.930683 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b53fe19-e26e-4ad2-a8fb-71de02002ef1-catalog-content\") pod \"redhat-operators-9zkcw\" (UID: \"6b53fe19-e26e-4ad2-a8fb-71de02002ef1\") " pod="openshift-marketplace/redhat-operators-9zkcw" Nov 25 10:11:47 crc kubenswrapper[4565]: I1125 10:11:47.032686 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6mn9\" (UniqueName: \"kubernetes.io/projected/6b53fe19-e26e-4ad2-a8fb-71de02002ef1-kube-api-access-t6mn9\") pod \"redhat-operators-9zkcw\" (UID: \"6b53fe19-e26e-4ad2-a8fb-71de02002ef1\") " pod="openshift-marketplace/redhat-operators-9zkcw" Nov 25 10:11:47 crc kubenswrapper[4565]: I1125 10:11:47.032762 4565 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b53fe19-e26e-4ad2-a8fb-71de02002ef1-utilities\") pod \"redhat-operators-9zkcw\" (UID: \"6b53fe19-e26e-4ad2-a8fb-71de02002ef1\") " pod="openshift-marketplace/redhat-operators-9zkcw" Nov 25 10:11:47 crc kubenswrapper[4565]: I1125 10:11:47.032799 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b53fe19-e26e-4ad2-a8fb-71de02002ef1-catalog-content\") pod \"redhat-operators-9zkcw\" (UID: \"6b53fe19-e26e-4ad2-a8fb-71de02002ef1\") " pod="openshift-marketplace/redhat-operators-9zkcw" Nov 25 10:11:47 crc kubenswrapper[4565]: I1125 10:11:47.033325 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b53fe19-e26e-4ad2-a8fb-71de02002ef1-catalog-content\") pod \"redhat-operators-9zkcw\" (UID: \"6b53fe19-e26e-4ad2-a8fb-71de02002ef1\") " pod="openshift-marketplace/redhat-operators-9zkcw" Nov 25 10:11:47 crc kubenswrapper[4565]: I1125 10:11:47.033809 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b53fe19-e26e-4ad2-a8fb-71de02002ef1-utilities\") pod \"redhat-operators-9zkcw\" (UID: \"6b53fe19-e26e-4ad2-a8fb-71de02002ef1\") " pod="openshift-marketplace/redhat-operators-9zkcw" Nov 25 10:11:47 crc kubenswrapper[4565]: I1125 10:11:47.057074 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6mn9\" (UniqueName: \"kubernetes.io/projected/6b53fe19-e26e-4ad2-a8fb-71de02002ef1-kube-api-access-t6mn9\") pod \"redhat-operators-9zkcw\" (UID: \"6b53fe19-e26e-4ad2-a8fb-71de02002ef1\") " pod="openshift-marketplace/redhat-operators-9zkcw" Nov 25 10:11:47 crc kubenswrapper[4565]: I1125 10:11:47.140236 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9zkcw" Nov 25 10:11:47 crc kubenswrapper[4565]: I1125 10:11:47.639944 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9zkcw"] Nov 25 10:11:47 crc kubenswrapper[4565]: I1125 10:11:47.869620 4565 generic.go:334] "Generic (PLEG): container finished" podID="6b53fe19-e26e-4ad2-a8fb-71de02002ef1" containerID="03101c8cfa77173ffb3c874af437551aa0632a015cad60f5586d959577b15613" exitCode=0 Nov 25 10:11:47 crc kubenswrapper[4565]: I1125 10:11:47.869783 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9zkcw" event={"ID":"6b53fe19-e26e-4ad2-a8fb-71de02002ef1","Type":"ContainerDied","Data":"03101c8cfa77173ffb3c874af437551aa0632a015cad60f5586d959577b15613"} Nov 25 10:11:47 crc kubenswrapper[4565]: I1125 10:11:47.869887 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9zkcw" event={"ID":"6b53fe19-e26e-4ad2-a8fb-71de02002ef1","Type":"ContainerStarted","Data":"59e8c8d237a7650e2bd81d2b9fc7e823d124c20e627024a514f00c01efebf670"} Nov 25 10:11:49 crc kubenswrapper[4565]: I1125 10:11:49.895867 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9zkcw" event={"ID":"6b53fe19-e26e-4ad2-a8fb-71de02002ef1","Type":"ContainerStarted","Data":"03e3583951450d78838c95f0e0e02c9c4e8bfc42806bc7d135bafbc0387698aa"} Nov 25 10:11:50 crc kubenswrapper[4565]: I1125 10:11:50.934410 4565 generic.go:334] "Generic (PLEG): container finished" podID="6b53fe19-e26e-4ad2-a8fb-71de02002ef1" containerID="03e3583951450d78838c95f0e0e02c9c4e8bfc42806bc7d135bafbc0387698aa" exitCode=0 Nov 25 10:11:50 crc kubenswrapper[4565]: I1125 10:11:50.934687 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9zkcw" 
event={"ID":"6b53fe19-e26e-4ad2-a8fb-71de02002ef1","Type":"ContainerDied","Data":"03e3583951450d78838c95f0e0e02c9c4e8bfc42806bc7d135bafbc0387698aa"} Nov 25 10:11:52 crc kubenswrapper[4565]: I1125 10:11:52.961056 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9zkcw" event={"ID":"6b53fe19-e26e-4ad2-a8fb-71de02002ef1","Type":"ContainerStarted","Data":"a12b170113601cdbd6a496ad2a07f06b7a8552f7653ba22068f7827e6026bea0"} Nov 25 10:11:53 crc kubenswrapper[4565]: I1125 10:11:53.002908 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9zkcw" podStartSLOduration=3.460787259 podStartE2EDuration="7.002878165s" podCreationTimestamp="2025-11-25 10:11:46 +0000 UTC" firstStartedPulling="2025-11-25 10:11:47.872167125 +0000 UTC m=+4041.074662262" lastFinishedPulling="2025-11-25 10:11:51.414258029 +0000 UTC m=+4044.616753168" observedRunningTime="2025-11-25 10:11:52.981990154 +0000 UTC m=+4046.184485292" watchObservedRunningTime="2025-11-25 10:11:53.002878165 +0000 UTC m=+4046.205373303" Nov 25 10:11:55 crc kubenswrapper[4565]: I1125 10:11:55.100006 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 10:11:55 crc kubenswrapper[4565]: I1125 10:11:55.100471 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 10:11:57 crc kubenswrapper[4565]: I1125 10:11:57.141699 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-9zkcw" Nov 25 10:11:57 crc kubenswrapper[4565]: I1125 10:11:57.141753 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9zkcw" Nov 25 10:11:58 crc kubenswrapper[4565]: I1125 10:11:58.180467 4565 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9zkcw" podUID="6b53fe19-e26e-4ad2-a8fb-71de02002ef1" containerName="registry-server" probeResult="failure" output=< Nov 25 10:11:58 crc kubenswrapper[4565]: timeout: failed to connect service ":50051" within 1s Nov 25 10:11:58 crc kubenswrapper[4565]: > Nov 25 10:12:07 crc kubenswrapper[4565]: I1125 10:12:07.194851 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9zkcw" Nov 25 10:12:07 crc kubenswrapper[4565]: I1125 10:12:07.250545 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9zkcw" Nov 25 10:12:07 crc kubenswrapper[4565]: I1125 10:12:07.432779 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9zkcw"] Nov 25 10:12:09 crc kubenswrapper[4565]: I1125 10:12:09.132377 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9zkcw" podUID="6b53fe19-e26e-4ad2-a8fb-71de02002ef1" containerName="registry-server" containerID="cri-o://a12b170113601cdbd6a496ad2a07f06b7a8552f7653ba22068f7827e6026bea0" gracePeriod=2 Nov 25 10:12:09 crc kubenswrapper[4565]: I1125 10:12:09.680869 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9zkcw" Nov 25 10:12:09 crc kubenswrapper[4565]: I1125 10:12:09.711139 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b53fe19-e26e-4ad2-a8fb-71de02002ef1-utilities\") pod \"6b53fe19-e26e-4ad2-a8fb-71de02002ef1\" (UID: \"6b53fe19-e26e-4ad2-a8fb-71de02002ef1\") " Nov 25 10:12:09 crc kubenswrapper[4565]: I1125 10:12:09.711493 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6mn9\" (UniqueName: \"kubernetes.io/projected/6b53fe19-e26e-4ad2-a8fb-71de02002ef1-kube-api-access-t6mn9\") pod \"6b53fe19-e26e-4ad2-a8fb-71de02002ef1\" (UID: \"6b53fe19-e26e-4ad2-a8fb-71de02002ef1\") " Nov 25 10:12:09 crc kubenswrapper[4565]: I1125 10:12:09.711682 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b53fe19-e26e-4ad2-a8fb-71de02002ef1-catalog-content\") pod \"6b53fe19-e26e-4ad2-a8fb-71de02002ef1\" (UID: \"6b53fe19-e26e-4ad2-a8fb-71de02002ef1\") " Nov 25 10:12:09 crc kubenswrapper[4565]: I1125 10:12:09.711819 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b53fe19-e26e-4ad2-a8fb-71de02002ef1-utilities" (OuterVolumeSpecName: "utilities") pod "6b53fe19-e26e-4ad2-a8fb-71de02002ef1" (UID: "6b53fe19-e26e-4ad2-a8fb-71de02002ef1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 10:12:09 crc kubenswrapper[4565]: I1125 10:12:09.712534 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b53fe19-e26e-4ad2-a8fb-71de02002ef1-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 10:12:09 crc kubenswrapper[4565]: I1125 10:12:09.718787 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b53fe19-e26e-4ad2-a8fb-71de02002ef1-kube-api-access-t6mn9" (OuterVolumeSpecName: "kube-api-access-t6mn9") pod "6b53fe19-e26e-4ad2-a8fb-71de02002ef1" (UID: "6b53fe19-e26e-4ad2-a8fb-71de02002ef1"). InnerVolumeSpecName "kube-api-access-t6mn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 10:12:09 crc kubenswrapper[4565]: I1125 10:12:09.817371 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6mn9\" (UniqueName: \"kubernetes.io/projected/6b53fe19-e26e-4ad2-a8fb-71de02002ef1-kube-api-access-t6mn9\") on node \"crc\" DevicePath \"\"" Nov 25 10:12:09 crc kubenswrapper[4565]: I1125 10:12:09.853956 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b53fe19-e26e-4ad2-a8fb-71de02002ef1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b53fe19-e26e-4ad2-a8fb-71de02002ef1" (UID: "6b53fe19-e26e-4ad2-a8fb-71de02002ef1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 10:12:09 crc kubenswrapper[4565]: I1125 10:12:09.919426 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b53fe19-e26e-4ad2-a8fb-71de02002ef1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 10:12:10 crc kubenswrapper[4565]: I1125 10:12:10.145713 4565 generic.go:334] "Generic (PLEG): container finished" podID="6b53fe19-e26e-4ad2-a8fb-71de02002ef1" containerID="a12b170113601cdbd6a496ad2a07f06b7a8552f7653ba22068f7827e6026bea0" exitCode=0 Nov 25 10:12:10 crc kubenswrapper[4565]: I1125 10:12:10.145796 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9zkcw" event={"ID":"6b53fe19-e26e-4ad2-a8fb-71de02002ef1","Type":"ContainerDied","Data":"a12b170113601cdbd6a496ad2a07f06b7a8552f7653ba22068f7827e6026bea0"} Nov 25 10:12:10 crc kubenswrapper[4565]: I1125 10:12:10.145831 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9zkcw" Nov 25 10:12:10 crc kubenswrapper[4565]: I1125 10:12:10.145864 4565 scope.go:117] "RemoveContainer" containerID="a12b170113601cdbd6a496ad2a07f06b7a8552f7653ba22068f7827e6026bea0" Nov 25 10:12:10 crc kubenswrapper[4565]: I1125 10:12:10.145846 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9zkcw" event={"ID":"6b53fe19-e26e-4ad2-a8fb-71de02002ef1","Type":"ContainerDied","Data":"59e8c8d237a7650e2bd81d2b9fc7e823d124c20e627024a514f00c01efebf670"} Nov 25 10:12:10 crc kubenswrapper[4565]: I1125 10:12:10.166544 4565 scope.go:117] "RemoveContainer" containerID="03e3583951450d78838c95f0e0e02c9c4e8bfc42806bc7d135bafbc0387698aa" Nov 25 10:12:10 crc kubenswrapper[4565]: I1125 10:12:10.185244 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9zkcw"] Nov 25 10:12:10 crc kubenswrapper[4565]: I1125 10:12:10.192259 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9zkcw"] Nov 25 10:12:10 crc kubenswrapper[4565]: I1125 10:12:10.428174 4565 scope.go:117] "RemoveContainer" containerID="03101c8cfa77173ffb3c874af437551aa0632a015cad60f5586d959577b15613" Nov 25 10:12:10 crc kubenswrapper[4565]: I1125 10:12:10.460380 4565 scope.go:117] "RemoveContainer" containerID="a12b170113601cdbd6a496ad2a07f06b7a8552f7653ba22068f7827e6026bea0" Nov 25 10:12:10 crc kubenswrapper[4565]: E1125 10:12:10.462666 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a12b170113601cdbd6a496ad2a07f06b7a8552f7653ba22068f7827e6026bea0\": container with ID starting with a12b170113601cdbd6a496ad2a07f06b7a8552f7653ba22068f7827e6026bea0 not found: ID does not exist" containerID="a12b170113601cdbd6a496ad2a07f06b7a8552f7653ba22068f7827e6026bea0" Nov 25 10:12:10 crc kubenswrapper[4565]: I1125 10:12:10.462738 4565 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a12b170113601cdbd6a496ad2a07f06b7a8552f7653ba22068f7827e6026bea0"} err="failed to get container status \"a12b170113601cdbd6a496ad2a07f06b7a8552f7653ba22068f7827e6026bea0\": rpc error: code = NotFound desc = could not find container \"a12b170113601cdbd6a496ad2a07f06b7a8552f7653ba22068f7827e6026bea0\": container with ID starting with a12b170113601cdbd6a496ad2a07f06b7a8552f7653ba22068f7827e6026bea0 not found: ID does not exist" Nov 25 10:12:10 crc kubenswrapper[4565]: I1125 10:12:10.462777 4565 scope.go:117] "RemoveContainer" containerID="03e3583951450d78838c95f0e0e02c9c4e8bfc42806bc7d135bafbc0387698aa" Nov 25 10:12:10 crc kubenswrapper[4565]: E1125 10:12:10.466193 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03e3583951450d78838c95f0e0e02c9c4e8bfc42806bc7d135bafbc0387698aa\": container with ID starting with 03e3583951450d78838c95f0e0e02c9c4e8bfc42806bc7d135bafbc0387698aa not found: ID does not exist" containerID="03e3583951450d78838c95f0e0e02c9c4e8bfc42806bc7d135bafbc0387698aa" Nov 25 10:12:10 crc kubenswrapper[4565]: I1125 10:12:10.466248 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03e3583951450d78838c95f0e0e02c9c4e8bfc42806bc7d135bafbc0387698aa"} err="failed to get container status \"03e3583951450d78838c95f0e0e02c9c4e8bfc42806bc7d135bafbc0387698aa\": rpc error: code = NotFound desc = could not find container \"03e3583951450d78838c95f0e0e02c9c4e8bfc42806bc7d135bafbc0387698aa\": container with ID starting with 03e3583951450d78838c95f0e0e02c9c4e8bfc42806bc7d135bafbc0387698aa not found: ID does not exist" Nov 25 10:12:10 crc kubenswrapper[4565]: I1125 10:12:10.466285 4565 scope.go:117] "RemoveContainer" containerID="03101c8cfa77173ffb3c874af437551aa0632a015cad60f5586d959577b15613" Nov 25 10:12:10 crc kubenswrapper[4565]: E1125 
10:12:10.466707 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03101c8cfa77173ffb3c874af437551aa0632a015cad60f5586d959577b15613\": container with ID starting with 03101c8cfa77173ffb3c874af437551aa0632a015cad60f5586d959577b15613 not found: ID does not exist" containerID="03101c8cfa77173ffb3c874af437551aa0632a015cad60f5586d959577b15613" Nov 25 10:12:10 crc kubenswrapper[4565]: I1125 10:12:10.466780 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03101c8cfa77173ffb3c874af437551aa0632a015cad60f5586d959577b15613"} err="failed to get container status \"03101c8cfa77173ffb3c874af437551aa0632a015cad60f5586d959577b15613\": rpc error: code = NotFound desc = could not find container \"03101c8cfa77173ffb3c874af437551aa0632a015cad60f5586d959577b15613\": container with ID starting with 03101c8cfa77173ffb3c874af437551aa0632a015cad60f5586d959577b15613 not found: ID does not exist" Nov 25 10:12:11 crc kubenswrapper[4565]: I1125 10:12:11.107992 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b53fe19-e26e-4ad2-a8fb-71de02002ef1" path="/var/lib/kubelet/pods/6b53fe19-e26e-4ad2-a8fb-71de02002ef1/volumes" Nov 25 10:12:25 crc kubenswrapper[4565]: I1125 10:12:25.099147 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 10:12:25 crc kubenswrapper[4565]: I1125 10:12:25.099633 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Nov 25 10:12:25 crc kubenswrapper[4565]: I1125 10:12:25.108390 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 10:12:25 crc kubenswrapper[4565]: I1125 10:12:25.109023 4565 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2c700bc6358e1ec78e7ef216340493892cce22209e9acdeedd22c23d90b0fb59"} pod="openshift-machine-config-operator/machine-config-daemon-r28bt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 10:12:25 crc kubenswrapper[4565]: I1125 10:12:25.109079 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" containerID="cri-o://2c700bc6358e1ec78e7ef216340493892cce22209e9acdeedd22c23d90b0fb59" gracePeriod=600 Nov 25 10:12:25 crc kubenswrapper[4565]: I1125 10:12:25.297574 4565 generic.go:334] "Generic (PLEG): container finished" podID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerID="2c700bc6358e1ec78e7ef216340493892cce22209e9acdeedd22c23d90b0fb59" exitCode=0 Nov 25 10:12:25 crc kubenswrapper[4565]: I1125 10:12:25.297859 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerDied","Data":"2c700bc6358e1ec78e7ef216340493892cce22209e9acdeedd22c23d90b0fb59"} Nov 25 10:12:25 crc kubenswrapper[4565]: I1125 10:12:25.297900 4565 scope.go:117] "RemoveContainer" containerID="49a5fa692c135dc439224df449040b0ec7e661a80367d33f3f9005181542549c" Nov 25 10:12:26 crc kubenswrapper[4565]: I1125 10:12:26.310052 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" 
event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerStarted","Data":"ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02"} Nov 25 10:12:38 crc kubenswrapper[4565]: I1125 10:12:38.194088 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6b4c469f64-4jfxz_b577976e-309e-47cd-80a6-4f72547d912b/barbican-api/0.log" Nov 25 10:12:38 crc kubenswrapper[4565]: I1125 10:12:38.252499 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6b4c469f64-4jfxz_b577976e-309e-47cd-80a6-4f72547d912b/barbican-api-log/0.log" Nov 25 10:12:38 crc kubenswrapper[4565]: I1125 10:12:38.423306 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58b56d8886-ghgt5_b3377fa1-4b60-4b8c-9efb-a266d872af91/barbican-keystone-listener/0.log" Nov 25 10:12:38 crc kubenswrapper[4565]: I1125 10:12:38.473039 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58b56d8886-ghgt5_b3377fa1-4b60-4b8c-9efb-a266d872af91/barbican-keystone-listener-log/0.log" Nov 25 10:12:38 crc kubenswrapper[4565]: I1125 10:12:38.525999 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-66898497b9-mtk55_5ff2dc5b-9c32-43bc-b8c7-7341812d4160/barbican-worker/0.log" Nov 25 10:12:38 crc kubenswrapper[4565]: I1125 10:12:38.655306 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-66898497b9-mtk55_5ff2dc5b-9c32-43bc-b8c7-7341812d4160/barbican-worker-log/0.log" Nov 25 10:12:38 crc kubenswrapper[4565]: I1125 10:12:38.807531 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-2vr7f_3de99810-1335-466f-8386-e4ecbe49f3fd/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:12:38 crc kubenswrapper[4565]: I1125 10:12:38.944577 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_57d9524a-c577-4af2-a063-06c4928a3505/ceilometer-central-agent/0.log" Nov 25 10:12:39 crc kubenswrapper[4565]: I1125 10:12:39.103539 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_57d9524a-c577-4af2-a063-06c4928a3505/proxy-httpd/0.log" Nov 25 10:12:39 crc kubenswrapper[4565]: I1125 10:12:39.121698 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_57d9524a-c577-4af2-a063-06c4928a3505/sg-core/0.log" Nov 25 10:12:39 crc kubenswrapper[4565]: I1125 10:12:39.156460 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_57d9524a-c577-4af2-a063-06c4928a3505/ceilometer-notification-agent/0.log" Nov 25 10:12:39 crc kubenswrapper[4565]: I1125 10:12:39.322658 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g9l7v_6fcb5ee3-1258-4245-bd22-5aecd14a312c/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:12:39 crc kubenswrapper[4565]: I1125 10:12:39.351119 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-rqcxx_d22f3d2a-b6ff-4188-9778-e7108dd44f3a/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:12:39 crc kubenswrapper[4565]: I1125 10:12:39.613573 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dd508fe3-6c52-4b41-a308-7f8697523a81/cinder-api/0.log" Nov 25 10:12:39 crc kubenswrapper[4565]: I1125 10:12:39.643365 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dd508fe3-6c52-4b41-a308-7f8697523a81/cinder-api-log/0.log" Nov 25 10:12:40 crc kubenswrapper[4565]: I1125 10:12:40.640838 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_d40080b6-5cb6-48e6-9625-9c8b821ed10b/cinder-backup/0.log" Nov 25 10:12:40 crc kubenswrapper[4565]: I1125 10:12:40.663015 
4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_d40080b6-5cb6-48e6-9625-9c8b821ed10b/probe/0.log" Nov 25 10:12:40 crc kubenswrapper[4565]: I1125 10:12:40.879919 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_192bb64d-39b3-4dad-a57b-65afe8c7ec7e/cinder-scheduler/0.log" Nov 25 10:12:41 crc kubenswrapper[4565]: I1125 10:12:41.116537 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_192bb64d-39b3-4dad-a57b-65afe8c7ec7e/probe/0.log" Nov 25 10:12:41 crc kubenswrapper[4565]: I1125 10:12:41.232083 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_d3b9c251-0777-4463-916e-b6712e7a69b7/probe/0.log" Nov 25 10:12:41 crc kubenswrapper[4565]: I1125 10:12:41.281547 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_d3b9c251-0777-4463-916e-b6712e7a69b7/cinder-volume/0.log" Nov 25 10:12:41 crc kubenswrapper[4565]: I1125 10:12:41.434538 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-lggx8_13f5faf5-45eb-46fc-b76b-59b8babba10c/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:12:42 crc kubenswrapper[4565]: I1125 10:12:42.059706 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-jq72p_49dcd4ab-323d-4499-97a6-69fe4e29a0a6/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:12:42 crc kubenswrapper[4565]: I1125 10:12:42.087211 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-f7c9fd6c5-k87xr_15b78aa8-e609-4509-bc5e-4ec6fa67dd57/init/0.log" Nov 25 10:12:42 crc kubenswrapper[4565]: I1125 10:12:42.331167 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-f7c9fd6c5-k87xr_15b78aa8-e609-4509-bc5e-4ec6fa67dd57/init/0.log" Nov 25 10:12:42 crc kubenswrapper[4565]: I1125 10:12:42.365556 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e1ee609b-48dd-4e92-9b13-e9bf94768ead/glance-httpd/0.log" Nov 25 10:12:42 crc kubenswrapper[4565]: I1125 10:12:42.450694 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e1ee609b-48dd-4e92-9b13-e9bf94768ead/glance-log/0.log" Nov 25 10:12:42 crc kubenswrapper[4565]: I1125 10:12:42.686719 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_451d4f5d-1ecc-4633-a889-ea95473bc981/glance-log/0.log" Nov 25 10:12:42 crc kubenswrapper[4565]: I1125 10:12:42.720574 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_451d4f5d-1ecc-4633-a889-ea95473bc981/glance-httpd/0.log" Nov 25 10:12:43 crc kubenswrapper[4565]: I1125 10:12:43.195450 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-txnr9_e277484b-b7d0-4a20-9551-a4d62a9720ea/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:12:43 crc kubenswrapper[4565]: I1125 10:12:43.241023 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6549bb6ccb-qd7ll_d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb/horizon/0.log" Nov 25 10:12:43 crc kubenswrapper[4565]: I1125 10:12:43.483579 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6549bb6ccb-qd7ll_d054743a-38ab-4e7b-8ad9-c8f07fe9f7cb/horizon-log/0.log" Nov 25 10:12:43 crc kubenswrapper[4565]: I1125 10:12:43.650295 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-h9t4j_766382aa-dcdb-41f0-afb0-cb90d5ea8f31/install-os-edpm-deployment-openstack-edpm-ipam/0.log" 
Nov 25 10:12:44 crc kubenswrapper[4565]: I1125 10:12:44.010893 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29401081-lh62g_3749c5c9-d117-42da-bd30-bb593c5d1fb2/keystone-cron/0.log" Nov 25 10:12:44 crc kubenswrapper[4565]: I1125 10:12:44.073905 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-757f65c548-72sgl_187eb1ab-d8ed-4506-bc95-59cb1f61e285/keystone-api/0.log" Nov 25 10:12:44 crc kubenswrapper[4565]: I1125 10:12:44.508102 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_5275621d-5c51-4586-85f2-e0e24cb32266/kube-state-metrics/3.log" Nov 25 10:12:44 crc kubenswrapper[4565]: I1125 10:12:44.608703 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_5275621d-5c51-4586-85f2-e0e24cb32266/kube-state-metrics/2.log" Nov 25 10:12:45 crc kubenswrapper[4565]: I1125 10:12:45.018288 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-xqn99_e1061e52-8553-4932-9689-83016e2b413f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:12:45 crc kubenswrapper[4565]: I1125 10:12:45.024242 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-f7c9fd6c5-k87xr_15b78aa8-e609-4509-bc5e-4ec6fa67dd57/dnsmasq-dns/0.log" Nov 25 10:12:45 crc kubenswrapper[4565]: I1125 10:12:45.140998 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_dbdf37e4-2636-44d8-981a-4c960e37799e/manila-api/0.log" Nov 25 10:12:45 crc kubenswrapper[4565]: I1125 10:12:45.350496 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_dbdf37e4-2636-44d8-981a-4c960e37799e/manila-api-log/0.log" Nov 25 10:12:45 crc kubenswrapper[4565]: I1125 10:12:45.354167 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-scheduler-0_8fef203b-c8bb-4fb3-9415-b042e6837bff/probe/0.log" Nov 25 10:12:45 crc kubenswrapper[4565]: I1125 10:12:45.369132 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_8fef203b-c8bb-4fb3-9415-b042e6837bff/manila-scheduler/0.log" Nov 25 10:12:45 crc kubenswrapper[4565]: I1125 10:12:45.547992 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_60d980f4-b1c6-4991-ae13-143dcb6bf453/probe/0.log" Nov 25 10:12:45 crc kubenswrapper[4565]: I1125 10:12:45.752953 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_60d980f4-b1c6-4991-ae13-143dcb6bf453/manila-share/0.log" Nov 25 10:12:46 crc kubenswrapper[4565]: I1125 10:12:46.038681 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d4d697775-b8wbb_aca1b06e-7a63-4a42-873b-427de57cef7f/neutron-api/0.log" Nov 25 10:12:46 crc kubenswrapper[4565]: I1125 10:12:46.114970 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d4d697775-b8wbb_aca1b06e-7a63-4a42-873b-427de57cef7f/neutron-httpd/0.log" Nov 25 10:12:46 crc kubenswrapper[4565]: I1125 10:12:46.138636 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-bwbsw_f5a1f544-cbde-40e4-aec7-72347718b75d/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:12:46 crc kubenswrapper[4565]: I1125 10:12:46.806575 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_13ff22ce-a715-4dba-aaa3-b6ba3a929d55/nova-api-log/0.log" Nov 25 10:12:46 crc kubenswrapper[4565]: I1125 10:12:46.823809 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_ba666b44-183c-4752-8f43-899a921da911/nova-cell0-conductor-conductor/0.log" Nov 25 10:12:47 crc kubenswrapper[4565]: I1125 10:12:47.250319 4565 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_13ff22ce-a715-4dba-aaa3-b6ba3a929d55/nova-api-api/0.log" Nov 25 10:12:47 crc kubenswrapper[4565]: I1125 10:12:47.278018 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_26443d78-c2f9-4e62-9f77-69dbca9848f0/nova-cell1-conductor-conductor/0.log" Nov 25 10:12:47 crc kubenswrapper[4565]: I1125 10:12:47.339482 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_42ef1e95-a44e-4dea-8127-228bd8065e0c/nova-cell1-novncproxy-novncproxy/0.log" Nov 25 10:12:48 crc kubenswrapper[4565]: I1125 10:12:48.059371 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2bqqm_769230ff-fe55-4c62-bc60-73797b5fc1bb/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:12:48 crc kubenswrapper[4565]: I1125 10:12:48.123313 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_1e090c67-47d1-445e-843f-4cf950699016/nova-metadata-log/0.log" Nov 25 10:12:48 crc kubenswrapper[4565]: I1125 10:12:48.628197 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_af73acca-d67e-47fe-89ff-70f865731045/mysql-bootstrap/0.log" Nov 25 10:12:48 crc kubenswrapper[4565]: I1125 10:12:48.670829 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c528e51d-34ba-402c-8985-b53beb776f43/nova-scheduler-scheduler/0.log" Nov 25 10:12:48 crc kubenswrapper[4565]: I1125 10:12:48.926540 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_af73acca-d67e-47fe-89ff-70f865731045/galera/0.log" Nov 25 10:12:48 crc kubenswrapper[4565]: I1125 10:12:48.945165 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_af73acca-d67e-47fe-89ff-70f865731045/mysql-bootstrap/0.log" Nov 25 
10:12:49 crc kubenswrapper[4565]: I1125 10:12:49.207214 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_80ebb406-0240-4ba6-86f1-177776f19865/mysql-bootstrap/0.log" Nov 25 10:12:49 crc kubenswrapper[4565]: I1125 10:12:49.662679 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_1e090c67-47d1-445e-843f-4cf950699016/nova-metadata-metadata/0.log" Nov 25 10:12:49 crc kubenswrapper[4565]: I1125 10:12:49.789805 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_80ebb406-0240-4ba6-86f1-177776f19865/mysql-bootstrap/0.log" Nov 25 10:12:49 crc kubenswrapper[4565]: I1125 10:12:49.881736 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_80ebb406-0240-4ba6-86f1-177776f19865/galera/0.log" Nov 25 10:12:49 crc kubenswrapper[4565]: I1125 10:12:49.982766 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_7dfd09a5-8627-4394-ac4f-367458ffe0b2/openstackclient/0.log" Nov 25 10:12:50 crc kubenswrapper[4565]: I1125 10:12:50.286690 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-7k468_67e2fa61-acc9-415b-9e10-0a35b6a3feb7/ovn-controller/0.log" Nov 25 10:12:50 crc kubenswrapper[4565]: I1125 10:12:50.304851 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ct2r6_1d48eb0e-12fc-4722-a8cc-cfd703dbc1c9/openstack-network-exporter/0.log" Nov 25 10:12:50 crc kubenswrapper[4565]: I1125 10:12:50.526596 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qhxwx_702c3b01-501a-42d1-a945-603af0fbd306/ovsdb-server-init/0.log" Nov 25 10:12:51 crc kubenswrapper[4565]: I1125 10:12:51.123401 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qhxwx_702c3b01-501a-42d1-a945-603af0fbd306/ovsdb-server/0.log" Nov 25 10:12:51 crc 
kubenswrapper[4565]: I1125 10:12:51.195104 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qhxwx_702c3b01-501a-42d1-a945-603af0fbd306/ovsdb-server-init/0.log" Nov 25 10:12:51 crc kubenswrapper[4565]: I1125 10:12:51.236494 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qhxwx_702c3b01-501a-42d1-a945-603af0fbd306/ovs-vswitchd/0.log" Nov 25 10:12:51 crc kubenswrapper[4565]: I1125 10:12:51.602388 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_abfe2157-f884-4325-8d80-7fa9b90754a9/ovn-northd/0.log" Nov 25 10:12:51 crc kubenswrapper[4565]: I1125 10:12:51.665834 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-m2rrn_eedd2b64-c2c0-43dd-a5d9-ee7508387909/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:12:51 crc kubenswrapper[4565]: I1125 10:12:51.713208 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_abfe2157-f884-4325-8d80-7fa9b90754a9/openstack-network-exporter/0.log" Nov 25 10:12:51 crc kubenswrapper[4565]: I1125 10:12:51.982153 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_68861e53-c198-4971-baf5-dd1653ef84ad/ovsdbserver-nb/0.log" Nov 25 10:12:52 crc kubenswrapper[4565]: I1125 10:12:52.028805 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_68861e53-c198-4971-baf5-dd1653ef84ad/openstack-network-exporter/0.log" Nov 25 10:12:52 crc kubenswrapper[4565]: I1125 10:12:52.192793 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_164519a3-6eaf-49ac-bc20-cd1a4b04d594/openstack-network-exporter/0.log" Nov 25 10:12:52 crc kubenswrapper[4565]: I1125 10:12:52.301731 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_164519a3-6eaf-49ac-bc20-cd1a4b04d594/ovsdbserver-sb/0.log" Nov 25 
10:12:52 crc kubenswrapper[4565]: I1125 10:12:52.576669 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-66b84f9bd8-pdssf_7d87b83a-02e7-48d5-8f87-e127fe8ffe0b/placement-api/0.log" Nov 25 10:12:52 crc kubenswrapper[4565]: I1125 10:12:52.615454 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-66b84f9bd8-pdssf_7d87b83a-02e7-48d5-8f87-e127fe8ffe0b/placement-log/0.log" Nov 25 10:12:52 crc kubenswrapper[4565]: I1125 10:12:52.790965 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9914fdc4-3539-4d6b-97cf-e4c5330acfc0/setup-container/0.log" Nov 25 10:12:53 crc kubenswrapper[4565]: I1125 10:12:53.064826 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9914fdc4-3539-4d6b-97cf-e4c5330acfc0/rabbitmq/0.log" Nov 25 10:12:53 crc kubenswrapper[4565]: I1125 10:12:53.131294 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9914fdc4-3539-4d6b-97cf-e4c5330acfc0/setup-container/0.log" Nov 25 10:12:53 crc kubenswrapper[4565]: I1125 10:12:53.142601 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_49a91aac-079e-475b-ac75-f400d2081405/setup-container/0.log" Nov 25 10:12:53 crc kubenswrapper[4565]: I1125 10:12:53.443146 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_49a91aac-079e-475b-ac75-f400d2081405/setup-container/0.log" Nov 25 10:12:53 crc kubenswrapper[4565]: I1125 10:12:53.512012 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_49a91aac-079e-475b-ac75-f400d2081405/rabbitmq/0.log" Nov 25 10:12:53 crc kubenswrapper[4565]: I1125 10:12:53.545385 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-xrjw7_dccca567-2d50-4077-8a64-803dafa14ffb/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:12:54 crc kubenswrapper[4565]: I1125 10:12:54.099186 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-vgsgw_e29506cc-593c-49fd-b8eb-0ec1c0c8be5b/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:12:54 crc kubenswrapper[4565]: I1125 10:12:54.125808 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-kl4bm_41dfaf03-cd54-49f2-9f4c-4dcf954d8d6a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:12:54 crc kubenswrapper[4565]: I1125 10:12:54.374907 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-x54cf_ff542e2a-3788-42a5-8a29-66f22838511d/ssh-known-hosts-edpm-deployment/0.log" Nov 25 10:12:54 crc kubenswrapper[4565]: I1125 10:12:54.528237 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_a3071a8a-a30b-4b2b-aea0-5882f4eff1b2/tempest-tests-tempest-tests-runner/0.log" Nov 25 10:12:54 crc kubenswrapper[4565]: I1125 10:12:54.830867 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_1a5fdee8-1424-4227-a297-3c68d5463280/test-operator-logs-container/0.log" Nov 25 10:12:55 crc kubenswrapper[4565]: I1125 10:12:55.032103 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-f5dm9_7b9ebd21-0421-42f3-a7e6-8f0038b8c07e/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 10:13:06 crc kubenswrapper[4565]: I1125 10:13:06.206021 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_26db83c6-ee58-44da-bcb6-16989b77fba4/memcached/0.log" Nov 25 10:13:12 crc 
kubenswrapper[4565]: I1125 10:13:12.214228 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f9njk"] Nov 25 10:13:12 crc kubenswrapper[4565]: E1125 10:13:12.215187 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b53fe19-e26e-4ad2-a8fb-71de02002ef1" containerName="extract-utilities" Nov 25 10:13:12 crc kubenswrapper[4565]: I1125 10:13:12.215203 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b53fe19-e26e-4ad2-a8fb-71de02002ef1" containerName="extract-utilities" Nov 25 10:13:12 crc kubenswrapper[4565]: E1125 10:13:12.215219 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b53fe19-e26e-4ad2-a8fb-71de02002ef1" containerName="extract-content" Nov 25 10:13:12 crc kubenswrapper[4565]: I1125 10:13:12.215225 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b53fe19-e26e-4ad2-a8fb-71de02002ef1" containerName="extract-content" Nov 25 10:13:12 crc kubenswrapper[4565]: E1125 10:13:12.215247 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b53fe19-e26e-4ad2-a8fb-71de02002ef1" containerName="registry-server" Nov 25 10:13:12 crc kubenswrapper[4565]: I1125 10:13:12.215255 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b53fe19-e26e-4ad2-a8fb-71de02002ef1" containerName="registry-server" Nov 25 10:13:12 crc kubenswrapper[4565]: I1125 10:13:12.215450 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b53fe19-e26e-4ad2-a8fb-71de02002ef1" containerName="registry-server" Nov 25 10:13:12 crc kubenswrapper[4565]: I1125 10:13:12.216684 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f9njk" Nov 25 10:13:12 crc kubenswrapper[4565]: I1125 10:13:12.282826 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a54946e-9de2-408e-93ce-1dd9bf48f5ad-utilities\") pod \"certified-operators-f9njk\" (UID: \"4a54946e-9de2-408e-93ce-1dd9bf48f5ad\") " pod="openshift-marketplace/certified-operators-f9njk" Nov 25 10:13:12 crc kubenswrapper[4565]: I1125 10:13:12.283099 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a54946e-9de2-408e-93ce-1dd9bf48f5ad-catalog-content\") pod \"certified-operators-f9njk\" (UID: \"4a54946e-9de2-408e-93ce-1dd9bf48f5ad\") " pod="openshift-marketplace/certified-operators-f9njk" Nov 25 10:13:12 crc kubenswrapper[4565]: I1125 10:13:12.283320 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcrkc\" (UniqueName: \"kubernetes.io/projected/4a54946e-9de2-408e-93ce-1dd9bf48f5ad-kube-api-access-qcrkc\") pod \"certified-operators-f9njk\" (UID: \"4a54946e-9de2-408e-93ce-1dd9bf48f5ad\") " pod="openshift-marketplace/certified-operators-f9njk" Nov 25 10:13:12 crc kubenswrapper[4565]: I1125 10:13:12.347649 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f9njk"] Nov 25 10:13:12 crc kubenswrapper[4565]: I1125 10:13:12.398733 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcrkc\" (UniqueName: \"kubernetes.io/projected/4a54946e-9de2-408e-93ce-1dd9bf48f5ad-kube-api-access-qcrkc\") pod \"certified-operators-f9njk\" (UID: \"4a54946e-9de2-408e-93ce-1dd9bf48f5ad\") " pod="openshift-marketplace/certified-operators-f9njk" Nov 25 10:13:12 crc kubenswrapper[4565]: I1125 10:13:12.398804 4565 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a54946e-9de2-408e-93ce-1dd9bf48f5ad-utilities\") pod \"certified-operators-f9njk\" (UID: \"4a54946e-9de2-408e-93ce-1dd9bf48f5ad\") " pod="openshift-marketplace/certified-operators-f9njk" Nov 25 10:13:12 crc kubenswrapper[4565]: I1125 10:13:12.398905 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a54946e-9de2-408e-93ce-1dd9bf48f5ad-catalog-content\") pod \"certified-operators-f9njk\" (UID: \"4a54946e-9de2-408e-93ce-1dd9bf48f5ad\") " pod="openshift-marketplace/certified-operators-f9njk" Nov 25 10:13:12 crc kubenswrapper[4565]: I1125 10:13:12.399334 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a54946e-9de2-408e-93ce-1dd9bf48f5ad-catalog-content\") pod \"certified-operators-f9njk\" (UID: \"4a54946e-9de2-408e-93ce-1dd9bf48f5ad\") " pod="openshift-marketplace/certified-operators-f9njk" Nov 25 10:13:12 crc kubenswrapper[4565]: I1125 10:13:12.399868 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a54946e-9de2-408e-93ce-1dd9bf48f5ad-utilities\") pod \"certified-operators-f9njk\" (UID: \"4a54946e-9de2-408e-93ce-1dd9bf48f5ad\") " pod="openshift-marketplace/certified-operators-f9njk" Nov 25 10:13:12 crc kubenswrapper[4565]: I1125 10:13:12.423675 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcrkc\" (UniqueName: \"kubernetes.io/projected/4a54946e-9de2-408e-93ce-1dd9bf48f5ad-kube-api-access-qcrkc\") pod \"certified-operators-f9njk\" (UID: \"4a54946e-9de2-408e-93ce-1dd9bf48f5ad\") " pod="openshift-marketplace/certified-operators-f9njk" Nov 25 10:13:12 crc kubenswrapper[4565]: I1125 10:13:12.541081 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f9njk" Nov 25 10:13:13 crc kubenswrapper[4565]: I1125 10:13:13.065808 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f9njk"] Nov 25 10:13:13 crc kubenswrapper[4565]: I1125 10:13:13.829529 4565 generic.go:334] "Generic (PLEG): container finished" podID="4a54946e-9de2-408e-93ce-1dd9bf48f5ad" containerID="a8141a25fb94286dd57169d0a77bce6a61ef7cc4f5c631fc56d9db7dc26a36a7" exitCode=0 Nov 25 10:13:13 crc kubenswrapper[4565]: I1125 10:13:13.829589 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9njk" event={"ID":"4a54946e-9de2-408e-93ce-1dd9bf48f5ad","Type":"ContainerDied","Data":"a8141a25fb94286dd57169d0a77bce6a61ef7cc4f5c631fc56d9db7dc26a36a7"} Nov 25 10:13:13 crc kubenswrapper[4565]: I1125 10:13:13.829870 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9njk" event={"ID":"4a54946e-9de2-408e-93ce-1dd9bf48f5ad","Type":"ContainerStarted","Data":"e6860c590abb173e83174268a9cf56722892edf49f098048e135bcae682c4d70"} Nov 25 10:13:14 crc kubenswrapper[4565]: I1125 10:13:14.841170 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9njk" event={"ID":"4a54946e-9de2-408e-93ce-1dd9bf48f5ad","Type":"ContainerStarted","Data":"d3b20af945f8753c79a902c6974b513731335af19d8d4e62c139e9a41b4458df"} Nov 25 10:13:15 crc kubenswrapper[4565]: I1125 10:13:15.850296 4565 generic.go:334] "Generic (PLEG): container finished" podID="4a54946e-9de2-408e-93ce-1dd9bf48f5ad" containerID="d3b20af945f8753c79a902c6974b513731335af19d8d4e62c139e9a41b4458df" exitCode=0 Nov 25 10:13:15 crc kubenswrapper[4565]: I1125 10:13:15.850494 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9njk" 
event={"ID":"4a54946e-9de2-408e-93ce-1dd9bf48f5ad","Type":"ContainerDied","Data":"d3b20af945f8753c79a902c6974b513731335af19d8d4e62c139e9a41b4458df"} Nov 25 10:13:17 crc kubenswrapper[4565]: I1125 10:13:17.868349 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9njk" event={"ID":"4a54946e-9de2-408e-93ce-1dd9bf48f5ad","Type":"ContainerStarted","Data":"a39747f72ff66f55125c4d2ca5d98a4da45f89276582ae47738cec22d9cb048d"} Nov 25 10:13:17 crc kubenswrapper[4565]: I1125 10:13:17.901809 4565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f9njk" podStartSLOduration=3.359065333 podStartE2EDuration="5.901791329s" podCreationTimestamp="2025-11-25 10:13:12 +0000 UTC" firstStartedPulling="2025-11-25 10:13:13.832359627 +0000 UTC m=+4127.034854765" lastFinishedPulling="2025-11-25 10:13:16.375085623 +0000 UTC m=+4129.577580761" observedRunningTime="2025-11-25 10:13:17.89319696 +0000 UTC m=+4131.095692098" watchObservedRunningTime="2025-11-25 10:13:17.901791329 +0000 UTC m=+4131.104286467" Nov 25 10:13:22 crc kubenswrapper[4565]: I1125 10:13:22.541551 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f9njk" Nov 25 10:13:22 crc kubenswrapper[4565]: I1125 10:13:22.542290 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f9njk" Nov 25 10:13:22 crc kubenswrapper[4565]: I1125 10:13:22.586175 4565 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f9njk" Nov 25 10:13:22 crc kubenswrapper[4565]: I1125 10:13:22.973026 4565 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f9njk" Nov 25 10:13:23 crc kubenswrapper[4565]: I1125 10:13:23.027370 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-f9njk"] Nov 25 10:13:24 crc kubenswrapper[4565]: I1125 10:13:24.942784 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f9njk" podUID="4a54946e-9de2-408e-93ce-1dd9bf48f5ad" containerName="registry-server" containerID="cri-o://a39747f72ff66f55125c4d2ca5d98a4da45f89276582ae47738cec22d9cb048d" gracePeriod=2 Nov 25 10:13:25 crc kubenswrapper[4565]: I1125 10:13:25.489906 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f9njk" Nov 25 10:13:25 crc kubenswrapper[4565]: I1125 10:13:25.605948 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a54946e-9de2-408e-93ce-1dd9bf48f5ad-catalog-content\") pod \"4a54946e-9de2-408e-93ce-1dd9bf48f5ad\" (UID: \"4a54946e-9de2-408e-93ce-1dd9bf48f5ad\") " Nov 25 10:13:25 crc kubenswrapper[4565]: I1125 10:13:25.606114 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a54946e-9de2-408e-93ce-1dd9bf48f5ad-utilities\") pod \"4a54946e-9de2-408e-93ce-1dd9bf48f5ad\" (UID: \"4a54946e-9de2-408e-93ce-1dd9bf48f5ad\") " Nov 25 10:13:25 crc kubenswrapper[4565]: I1125 10:13:25.606360 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcrkc\" (UniqueName: \"kubernetes.io/projected/4a54946e-9de2-408e-93ce-1dd9bf48f5ad-kube-api-access-qcrkc\") pod \"4a54946e-9de2-408e-93ce-1dd9bf48f5ad\" (UID: \"4a54946e-9de2-408e-93ce-1dd9bf48f5ad\") " Nov 25 10:13:25 crc kubenswrapper[4565]: I1125 10:13:25.606785 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a54946e-9de2-408e-93ce-1dd9bf48f5ad-utilities" (OuterVolumeSpecName: "utilities") pod "4a54946e-9de2-408e-93ce-1dd9bf48f5ad" (UID: 
"4a54946e-9de2-408e-93ce-1dd9bf48f5ad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 10:13:25 crc kubenswrapper[4565]: I1125 10:13:25.607318 4565 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a54946e-9de2-408e-93ce-1dd9bf48f5ad-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 10:13:25 crc kubenswrapper[4565]: I1125 10:13:25.613093 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a54946e-9de2-408e-93ce-1dd9bf48f5ad-kube-api-access-qcrkc" (OuterVolumeSpecName: "kube-api-access-qcrkc") pod "4a54946e-9de2-408e-93ce-1dd9bf48f5ad" (UID: "4a54946e-9de2-408e-93ce-1dd9bf48f5ad"). InnerVolumeSpecName "kube-api-access-qcrkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 10:13:25 crc kubenswrapper[4565]: I1125 10:13:25.648597 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a54946e-9de2-408e-93ce-1dd9bf48f5ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a54946e-9de2-408e-93ce-1dd9bf48f5ad" (UID: "4a54946e-9de2-408e-93ce-1dd9bf48f5ad"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 10:13:25 crc kubenswrapper[4565]: I1125 10:13:25.709538 4565 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a54946e-9de2-408e-93ce-1dd9bf48f5ad-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 10:13:25 crc kubenswrapper[4565]: I1125 10:13:25.709569 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcrkc\" (UniqueName: \"kubernetes.io/projected/4a54946e-9de2-408e-93ce-1dd9bf48f5ad-kube-api-access-qcrkc\") on node \"crc\" DevicePath \"\"" Nov 25 10:13:25 crc kubenswrapper[4565]: I1125 10:13:25.954338 4565 generic.go:334] "Generic (PLEG): container finished" podID="4a54946e-9de2-408e-93ce-1dd9bf48f5ad" containerID="a39747f72ff66f55125c4d2ca5d98a4da45f89276582ae47738cec22d9cb048d" exitCode=0 Nov 25 10:13:25 crc kubenswrapper[4565]: I1125 10:13:25.954432 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f9njk" Nov 25 10:13:25 crc kubenswrapper[4565]: I1125 10:13:25.954451 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9njk" event={"ID":"4a54946e-9de2-408e-93ce-1dd9bf48f5ad","Type":"ContainerDied","Data":"a39747f72ff66f55125c4d2ca5d98a4da45f89276582ae47738cec22d9cb048d"} Nov 25 10:13:25 crc kubenswrapper[4565]: I1125 10:13:25.954712 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9njk" event={"ID":"4a54946e-9de2-408e-93ce-1dd9bf48f5ad","Type":"ContainerDied","Data":"e6860c590abb173e83174268a9cf56722892edf49f098048e135bcae682c4d70"} Nov 25 10:13:25 crc kubenswrapper[4565]: I1125 10:13:25.954734 4565 scope.go:117] "RemoveContainer" containerID="a39747f72ff66f55125c4d2ca5d98a4da45f89276582ae47738cec22d9cb048d" Nov 25 10:13:25 crc kubenswrapper[4565]: I1125 10:13:25.985868 4565 scope.go:117] "RemoveContainer" 
containerID="d3b20af945f8753c79a902c6974b513731335af19d8d4e62c139e9a41b4458df" Nov 25 10:13:25 crc kubenswrapper[4565]: I1125 10:13:25.990864 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f9njk"] Nov 25 10:13:25 crc kubenswrapper[4565]: I1125 10:13:25.999440 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f9njk"] Nov 25 10:13:26 crc kubenswrapper[4565]: I1125 10:13:26.013911 4565 scope.go:117] "RemoveContainer" containerID="a8141a25fb94286dd57169d0a77bce6a61ef7cc4f5c631fc56d9db7dc26a36a7" Nov 25 10:13:26 crc kubenswrapper[4565]: I1125 10:13:26.046288 4565 scope.go:117] "RemoveContainer" containerID="a39747f72ff66f55125c4d2ca5d98a4da45f89276582ae47738cec22d9cb048d" Nov 25 10:13:26 crc kubenswrapper[4565]: E1125 10:13:26.046727 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a39747f72ff66f55125c4d2ca5d98a4da45f89276582ae47738cec22d9cb048d\": container with ID starting with a39747f72ff66f55125c4d2ca5d98a4da45f89276582ae47738cec22d9cb048d not found: ID does not exist" containerID="a39747f72ff66f55125c4d2ca5d98a4da45f89276582ae47738cec22d9cb048d" Nov 25 10:13:26 crc kubenswrapper[4565]: I1125 10:13:26.046759 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a39747f72ff66f55125c4d2ca5d98a4da45f89276582ae47738cec22d9cb048d"} err="failed to get container status \"a39747f72ff66f55125c4d2ca5d98a4da45f89276582ae47738cec22d9cb048d\": rpc error: code = NotFound desc = could not find container \"a39747f72ff66f55125c4d2ca5d98a4da45f89276582ae47738cec22d9cb048d\": container with ID starting with a39747f72ff66f55125c4d2ca5d98a4da45f89276582ae47738cec22d9cb048d not found: ID does not exist" Nov 25 10:13:26 crc kubenswrapper[4565]: I1125 10:13:26.046785 4565 scope.go:117] "RemoveContainer" 
containerID="d3b20af945f8753c79a902c6974b513731335af19d8d4e62c139e9a41b4458df" Nov 25 10:13:26 crc kubenswrapper[4565]: E1125 10:13:26.047268 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3b20af945f8753c79a902c6974b513731335af19d8d4e62c139e9a41b4458df\": container with ID starting with d3b20af945f8753c79a902c6974b513731335af19d8d4e62c139e9a41b4458df not found: ID does not exist" containerID="d3b20af945f8753c79a902c6974b513731335af19d8d4e62c139e9a41b4458df" Nov 25 10:13:26 crc kubenswrapper[4565]: I1125 10:13:26.047310 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3b20af945f8753c79a902c6974b513731335af19d8d4e62c139e9a41b4458df"} err="failed to get container status \"d3b20af945f8753c79a902c6974b513731335af19d8d4e62c139e9a41b4458df\": rpc error: code = NotFound desc = could not find container \"d3b20af945f8753c79a902c6974b513731335af19d8d4e62c139e9a41b4458df\": container with ID starting with d3b20af945f8753c79a902c6974b513731335af19d8d4e62c139e9a41b4458df not found: ID does not exist" Nov 25 10:13:26 crc kubenswrapper[4565]: I1125 10:13:26.047339 4565 scope.go:117] "RemoveContainer" containerID="a8141a25fb94286dd57169d0a77bce6a61ef7cc4f5c631fc56d9db7dc26a36a7" Nov 25 10:13:26 crc kubenswrapper[4565]: E1125 10:13:26.047680 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8141a25fb94286dd57169d0a77bce6a61ef7cc4f5c631fc56d9db7dc26a36a7\": container with ID starting with a8141a25fb94286dd57169d0a77bce6a61ef7cc4f5c631fc56d9db7dc26a36a7 not found: ID does not exist" containerID="a8141a25fb94286dd57169d0a77bce6a61ef7cc4f5c631fc56d9db7dc26a36a7" Nov 25 10:13:26 crc kubenswrapper[4565]: I1125 10:13:26.047712 4565 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a8141a25fb94286dd57169d0a77bce6a61ef7cc4f5c631fc56d9db7dc26a36a7"} err="failed to get container status \"a8141a25fb94286dd57169d0a77bce6a61ef7cc4f5c631fc56d9db7dc26a36a7\": rpc error: code = NotFound desc = could not find container \"a8141a25fb94286dd57169d0a77bce6a61ef7cc4f5c631fc56d9db7dc26a36a7\": container with ID starting with a8141a25fb94286dd57169d0a77bce6a61ef7cc4f5c631fc56d9db7dc26a36a7 not found: ID does not exist" Nov 25 10:13:27 crc kubenswrapper[4565]: I1125 10:13:27.108864 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a54946e-9de2-408e-93ce-1dd9bf48f5ad" path="/var/lib/kubelet/pods/4a54946e-9de2-408e-93ce-1dd9bf48f5ad/volumes" Nov 25 10:13:27 crc kubenswrapper[4565]: I1125 10:13:27.196038 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-86dc4d89c8-cxwrc_873884b1-6ee8-400c-9ca2-0b0b3c4618e9/manager/3.log" Nov 25 10:13:27 crc kubenswrapper[4565]: I1125 10:13:27.198892 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-86dc4d89c8-cxwrc_873884b1-6ee8-400c-9ca2-0b0b3c4618e9/kube-rbac-proxy/0.log" Nov 25 10:13:27 crc kubenswrapper[4565]: I1125 10:13:27.610774 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-86dc4d89c8-cxwrc_873884b1-6ee8-400c-9ca2-0b0b3c4618e9/manager/2.log" Nov 25 10:13:27 crc kubenswrapper[4565]: I1125 10:13:27.620109 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj_62937ebb-ead0-4d96-b186-9dfcc8967ec0/util/0.log" Nov 25 10:13:27 crc kubenswrapper[4565]: I1125 10:13:27.813209 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj_62937ebb-ead0-4d96-b186-9dfcc8967ec0/util/0.log" 
Nov 25 10:13:27 crc kubenswrapper[4565]: I1125 10:13:27.851896 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj_62937ebb-ead0-4d96-b186-9dfcc8967ec0/pull/0.log" Nov 25 10:13:27 crc kubenswrapper[4565]: I1125 10:13:27.876002 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj_62937ebb-ead0-4d96-b186-9dfcc8967ec0/pull/0.log" Nov 25 10:13:28 crc kubenswrapper[4565]: I1125 10:13:28.126234 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj_62937ebb-ead0-4d96-b186-9dfcc8967ec0/pull/0.log" Nov 25 10:13:28 crc kubenswrapper[4565]: I1125 10:13:28.142019 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj_62937ebb-ead0-4d96-b186-9dfcc8967ec0/extract/0.log" Nov 25 10:13:28 crc kubenswrapper[4565]: I1125 10:13:28.154960 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fdtcrj_62937ebb-ead0-4d96-b186-9dfcc8967ec0/util/0.log" Nov 25 10:13:28 crc kubenswrapper[4565]: I1125 10:13:28.299639 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79856dc55c-ddlth_1af57713-55c3-45ec-b98b-1aac75a2d60b/kube-rbac-proxy/0.log" Nov 25 10:13:28 crc kubenswrapper[4565]: I1125 10:13:28.385756 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79856dc55c-ddlth_1af57713-55c3-45ec-b98b-1aac75a2d60b/manager/3.log" Nov 25 10:13:28 crc kubenswrapper[4565]: I1125 10:13:28.427451 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79856dc55c-ddlth_1af57713-55c3-45ec-b98b-1aac75a2d60b/manager/2.log" Nov 25 10:13:28 crc kubenswrapper[4565]: I1125 10:13:28.570257 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d695c9b56-t68ww_a933a688-5393-4b7b-b0b7-6ee5791970b1/manager/3.log" Nov 25 10:13:28 crc kubenswrapper[4565]: I1125 10:13:28.572105 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d695c9b56-t68ww_a933a688-5393-4b7b-b0b7-6ee5791970b1/kube-rbac-proxy/0.log" Nov 25 10:13:28 crc kubenswrapper[4565]: I1125 10:13:28.697645 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d695c9b56-t68ww_a933a688-5393-4b7b-b0b7-6ee5791970b1/manager/2.log" Nov 25 10:13:28 crc kubenswrapper[4565]: I1125 10:13:28.764004 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-68b95954c9-f9bbj_92be75e0-b60b-4f41-bde1-4f74a4d306e3/kube-rbac-proxy/0.log" Nov 25 10:13:28 crc kubenswrapper[4565]: I1125 10:13:28.807372 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-68b95954c9-f9bbj_92be75e0-b60b-4f41-bde1-4f74a4d306e3/manager/3.log" Nov 25 10:13:28 crc kubenswrapper[4565]: I1125 10:13:28.930386 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-68b95954c9-f9bbj_92be75e0-b60b-4f41-bde1-4f74a4d306e3/manager/2.log" Nov 25 10:13:29 crc kubenswrapper[4565]: I1125 10:13:29.377709 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-774b86978c-bd8d6_93da1f7e-c5e8-4c9c-b6af-feb85c526b47/kube-rbac-proxy/0.log" Nov 25 10:13:29 crc kubenswrapper[4565]: I1125 10:13:29.425078 4565 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-774b86978c-bd8d6_93da1f7e-c5e8-4c9c-b6af-feb85c526b47/manager/3.log" Nov 25 10:13:29 crc kubenswrapper[4565]: I1125 10:13:29.468140 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-774b86978c-bd8d6_93da1f7e-c5e8-4c9c-b6af-feb85c526b47/manager/2.log" Nov 25 10:13:29 crc kubenswrapper[4565]: I1125 10:13:29.782745 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c9694994-2s9lf_354fe5db-35d0-4d94-989c-02a077f8bd20/kube-rbac-proxy/0.log" Nov 25 10:13:29 crc kubenswrapper[4565]: I1125 10:13:29.873802 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c9694994-2s9lf_354fe5db-35d0-4d94-989c-02a077f8bd20/manager/2.log" Nov 25 10:13:29 crc kubenswrapper[4565]: I1125 10:13:29.891237 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c9694994-2s9lf_354fe5db-35d0-4d94-989c-02a077f8bd20/manager/3.log" Nov 25 10:13:30 crc kubenswrapper[4565]: I1125 10:13:30.078381 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-d5cc86f4b-2q9rf_333ae034-2972-4915-a547-364c01510827/kube-rbac-proxy/0.log" Nov 25 10:13:30 crc kubenswrapper[4565]: I1125 10:13:30.116166 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-d5cc86f4b-2q9rf_333ae034-2972-4915-a547-364c01510827/manager/3.log" Nov 25 10:13:30 crc kubenswrapper[4565]: I1125 10:13:30.142511 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-d5cc86f4b-2q9rf_333ae034-2972-4915-a547-364c01510827/manager/2.log" Nov 25 10:13:30 crc kubenswrapper[4565]: I1125 
10:13:30.260234 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bfcdc958c-mjsqx_6402fac4-067f-4410-a00c-0d438d502f3c/kube-rbac-proxy/0.log" Nov 25 10:13:30 crc kubenswrapper[4565]: I1125 10:13:30.303832 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bfcdc958c-mjsqx_6402fac4-067f-4410-a00c-0d438d502f3c/manager/3.log" Nov 25 10:13:30 crc kubenswrapper[4565]: I1125 10:13:30.373895 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bfcdc958c-mjsqx_6402fac4-067f-4410-a00c-0d438d502f3c/manager/2.log" Nov 25 10:13:30 crc kubenswrapper[4565]: I1125 10:13:30.499298 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-748dc6576f-pcqxq_d5be161b-0f0c-485e-b1c7-50a9fff4b053/manager/3.log" Nov 25 10:13:30 crc kubenswrapper[4565]: I1125 10:13:30.523435 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-748dc6576f-pcqxq_d5be161b-0f0c-485e-b1c7-50a9fff4b053/kube-rbac-proxy/0.log" Nov 25 10:13:30 crc kubenswrapper[4565]: I1125 10:13:30.581564 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-748dc6576f-pcqxq_d5be161b-0f0c-485e-b1c7-50a9fff4b053/manager/2.log" Nov 25 10:13:30 crc kubenswrapper[4565]: I1125 10:13:30.734886 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58bb8d67cc-lz6zt_cf68120a-e894-4189-8035-91f8045618c0/kube-rbac-proxy/0.log" Nov 25 10:13:30 crc kubenswrapper[4565]: I1125 10:13:30.788486 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58bb8d67cc-lz6zt_cf68120a-e894-4189-8035-91f8045618c0/manager/3.log" Nov 25 
10:13:30 crc kubenswrapper[4565]: I1125 10:13:30.794920 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58bb8d67cc-lz6zt_cf68120a-e894-4189-8035-91f8045618c0/manager/2.log" Nov 25 10:13:30 crc kubenswrapper[4565]: I1125 10:13:30.978439 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-cb6c4fdb7-2gkww_4ee66804-213d-4e52-b04b-6b00eec8de2d/kube-rbac-proxy/0.log" Nov 25 10:13:31 crc kubenswrapper[4565]: I1125 10:13:31.000183 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-cb6c4fdb7-2gkww_4ee66804-213d-4e52-b04b-6b00eec8de2d/manager/3.log" Nov 25 10:13:31 crc kubenswrapper[4565]: I1125 10:13:31.051320 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-cb6c4fdb7-2gkww_4ee66804-213d-4e52-b04b-6b00eec8de2d/manager/2.log" Nov 25 10:13:31 crc kubenswrapper[4565]: I1125 10:13:31.171875 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7c57c8bbc4-pzd74_d0ef0237-045a-4153-a377-07b2c9e6ceba/kube-rbac-proxy/0.log" Nov 25 10:13:31 crc kubenswrapper[4565]: I1125 10:13:31.174463 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7c57c8bbc4-pzd74_d0ef0237-045a-4153-a377-07b2c9e6ceba/manager/3.log" Nov 25 10:13:31 crc kubenswrapper[4565]: I1125 10:13:31.256310 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7c57c8bbc4-pzd74_d0ef0237-045a-4153-a377-07b2c9e6ceba/manager/2.log" Nov 25 10:13:31 crc kubenswrapper[4565]: I1125 10:13:31.324972 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-n9bdd_f2c67417-c283-4158-91ec-f49478a5378e/kube-rbac-proxy/0.log" Nov 25 10:13:31 crc kubenswrapper[4565]: I1125 10:13:31.414909 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-n9bdd_f2c67417-c283-4158-91ec-f49478a5378e/manager/3.log" Nov 25 10:13:31 crc kubenswrapper[4565]: I1125 10:13:31.450408 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-n9bdd_f2c67417-c283-4158-91ec-f49478a5378e/manager/2.log" Nov 25 10:13:31 crc kubenswrapper[4565]: I1125 10:13:31.607723 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-fd75fd47d-hrr6t_6279e5b8-cc23-4b43-9554-754a61174bcd/kube-rbac-proxy/0.log" Nov 25 10:13:31 crc kubenswrapper[4565]: I1125 10:13:31.617226 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-fd75fd47d-hrr6t_6279e5b8-cc23-4b43-9554-754a61174bcd/manager/3.log" Nov 25 10:13:31 crc kubenswrapper[4565]: I1125 10:13:31.676777 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-fd75fd47d-hrr6t_6279e5b8-cc23-4b43-9554-754a61174bcd/manager/2.log" Nov 25 10:13:31 crc kubenswrapper[4565]: I1125 10:13:31.776798 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-b58f89467-sw4l6_d4a03edc-1b0f-4f50-bab7-b2292c453f4d/kube-rbac-proxy/0.log" Nov 25 10:13:31 crc kubenswrapper[4565]: I1125 10:13:31.844387 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-b58f89467-sw4l6_d4a03edc-1b0f-4f50-bab7-b2292c453f4d/manager/0.log" Nov 25 10:13:31 crc kubenswrapper[4565]: I1125 
10:13:31.909726 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-b58f89467-sw4l6_d4a03edc-1b0f-4f50-bab7-b2292c453f4d/manager/1.log" Nov 25 10:13:32 crc kubenswrapper[4565]: I1125 10:13:32.036982 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7cd5954d9-fkc7l_579400cf-d71f-47f4-a98e-b94ccbf4ff72/manager/1.log" Nov 25 10:13:32 crc kubenswrapper[4565]: I1125 10:13:32.299805 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7b567956b5-s8c4s_0c32d371-4207-4e71-8031-a27b6562f9a2/operator/1.log" Nov 25 10:13:32 crc kubenswrapper[4565]: I1125 10:13:32.487023 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7b567956b5-s8c4s_0c32d371-4207-4e71-8031-a27b6562f9a2/operator/0.log" Nov 25 10:13:32 crc kubenswrapper[4565]: I1125 10:13:32.512473 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7cd5954d9-fkc7l_579400cf-d71f-47f4-a98e-b94ccbf4ff72/manager/2.log" Nov 25 10:13:32 crc kubenswrapper[4565]: I1125 10:13:32.524464 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-2hp6w_4b5856eb-4d4d-406d-bb20-cbc44a10e522/registry-server/0.log" Nov 25 10:13:32 crc kubenswrapper[4565]: I1125 10:13:32.623283 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-66cf5c67ff-zz6wf_052c7786-4d54-4af0-8598-91ff09cdf966/kube-rbac-proxy/0.log" Nov 25 10:13:32 crc kubenswrapper[4565]: I1125 10:13:32.680803 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-66cf5c67ff-zz6wf_052c7786-4d54-4af0-8598-91ff09cdf966/manager/2.log" Nov 25 10:13:32 crc 
kubenswrapper[4565]: I1125 10:13:32.686843 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-66cf5c67ff-zz6wf_052c7786-4d54-4af0-8598-91ff09cdf966/manager/3.log" Nov 25 10:13:32 crc kubenswrapper[4565]: I1125 10:13:32.786295 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5db546f9d9-kgn59_31dbf471-6fab-4ddd-a384-e4dd5335d5dc/kube-rbac-proxy/0.log" Nov 25 10:13:32 crc kubenswrapper[4565]: I1125 10:13:32.851997 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5db546f9d9-kgn59_31dbf471-6fab-4ddd-a384-e4dd5335d5dc/manager/3.log" Nov 25 10:13:32 crc kubenswrapper[4565]: I1125 10:13:32.898206 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5db546f9d9-kgn59_31dbf471-6fab-4ddd-a384-e4dd5335d5dc/manager/2.log" Nov 25 10:13:32 crc kubenswrapper[4565]: I1125 10:13:32.898659 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-s4llp_a65931e1-7a1f-4251-9c4f-996b407dfb03/operator/3.log" Nov 25 10:13:33 crc kubenswrapper[4565]: I1125 10:13:33.021079 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-s4llp_a65931e1-7a1f-4251-9c4f-996b407dfb03/operator/2.log" Nov 25 10:13:33 crc kubenswrapper[4565]: I1125 10:13:33.141161 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6fdc4fcf86-zl2jr_f35f4446-328e-40d3-96d6-2bc814fb8a96/kube-rbac-proxy/0.log" Nov 25 10:13:33 crc kubenswrapper[4565]: I1125 10:13:33.176347 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6fdc4fcf86-zl2jr_f35f4446-328e-40d3-96d6-2bc814fb8a96/manager/2.log" Nov 25 10:13:33 crc kubenswrapper[4565]: I1125 10:13:33.191887 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6fdc4fcf86-zl2jr_f35f4446-328e-40d3-96d6-2bc814fb8a96/manager/3.log" Nov 25 10:13:33 crc kubenswrapper[4565]: I1125 10:13:33.342726 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-567f98c9d-7dzx4_1ef630cb-2220-41f5-8a3d-66a2a78ce0ce/kube-rbac-proxy/0.log" Nov 25 10:13:33 crc kubenswrapper[4565]: I1125 10:13:33.376871 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-567f98c9d-7dzx4_1ef630cb-2220-41f5-8a3d-66a2a78ce0ce/manager/3.log" Nov 25 10:13:33 crc kubenswrapper[4565]: I1125 10:13:33.416745 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cb74df96-sj4j7_cbdce822-eeeb-448b-9f3b-46fdf9e9b43d/kube-rbac-proxy/0.log" Nov 25 10:13:33 crc kubenswrapper[4565]: I1125 10:13:33.435555 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-567f98c9d-7dzx4_1ef630cb-2220-41f5-8a3d-66a2a78ce0ce/manager/2.log" Nov 25 10:13:33 crc kubenswrapper[4565]: I1125 10:13:33.636212 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cb74df96-sj4j7_cbdce822-eeeb-448b-9f3b-46fdf9e9b43d/manager/1.log" Nov 25 10:13:33 crc kubenswrapper[4565]: I1125 10:13:33.646488 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cb74df96-sj4j7_cbdce822-eeeb-448b-9f3b-46fdf9e9b43d/manager/0.log" Nov 25 10:13:33 crc kubenswrapper[4565]: I1125 10:13:33.659071 4565 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-864885998-v2c96_3791b99a-d877-470f-8a8f-56f7b02be997/kube-rbac-proxy/0.log" Nov 25 10:13:33 crc kubenswrapper[4565]: I1125 10:13:33.719464 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-864885998-v2c96_3791b99a-d877-470f-8a8f-56f7b02be997/manager/3.log" Nov 25 10:13:33 crc kubenswrapper[4565]: I1125 10:13:33.833728 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-864885998-v2c96_3791b99a-d877-470f-8a8f-56f7b02be997/manager/2.log" Nov 25 10:13:51 crc kubenswrapper[4565]: I1125 10:13:51.455096 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-smktm_00c23670-fc21-4730-a27e-ac490261f994/control-plane-machine-set-operator/0.log" Nov 25 10:13:51 crc kubenswrapper[4565]: I1125 10:13:51.721444 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8xkx9_262e9d06-4e95-4017-85d9-5657f520eb49/kube-rbac-proxy/0.log" Nov 25 10:13:51 crc kubenswrapper[4565]: I1125 10:13:51.730991 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8xkx9_262e9d06-4e95-4017-85d9-5657f520eb49/machine-api-operator/0.log" Nov 25 10:14:03 crc kubenswrapper[4565]: I1125 10:14:03.780364 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-h4n5f_bac4df7d-2428-4150-881b-5695b1cfbddd/cert-manager-controller/0.log" Nov 25 10:14:03 crc kubenswrapper[4565]: I1125 10:14:03.907242 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-fqzsz_9500e97b-07b7-43d8-bfdf-dab609ce7f67/cert-manager-cainjector/0.log" Nov 25 10:14:03 crc kubenswrapper[4565]: I1125 
10:14:03.990570 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-vpzsg_be96e081-d820-40aa-81e9-bff6c2392110/cert-manager-webhook/0.log" Nov 25 10:14:16 crc kubenswrapper[4565]: I1125 10:14:16.248453 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5874bd7bc5-d25tx_538d2898-95e6-4651-89d1-d5cb979d7aab/nmstate-console-plugin/0.log" Nov 25 10:14:16 crc kubenswrapper[4565]: I1125 10:14:16.361251 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-b57v9_26159181-25a7-4f96-8bf7-059faaff18e0/nmstate-handler/0.log" Nov 25 10:14:16 crc kubenswrapper[4565]: I1125 10:14:16.453035 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-gfhjf_5745386c-25f6-4be7-bdc7-c299e25185d4/kube-rbac-proxy/0.log" Nov 25 10:14:16 crc kubenswrapper[4565]: I1125 10:14:16.528597 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-gfhjf_5745386c-25f6-4be7-bdc7-c299e25185d4/nmstate-metrics/0.log" Nov 25 10:14:16 crc kubenswrapper[4565]: I1125 10:14:16.969952 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-557fdffb88-xvfww_63b01418-682e-4ebe-874d-aab5928c222a/nmstate-operator/0.log" Nov 25 10:14:17 crc kubenswrapper[4565]: I1125 10:14:17.024303 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6b89b748d8-bgc56_e12b0906-c6a2-468a-8bc1-a29bda6a25e3/nmstate-webhook/0.log" Nov 25 10:14:25 crc kubenswrapper[4565]: I1125 10:14:25.099405 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 10:14:25 
crc kubenswrapper[4565]: I1125 10:14:25.100105 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 10:14:31 crc kubenswrapper[4565]: I1125 10:14:31.514110 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-gzvgq_2d23227d-3456-4a34-9aa7-4878c7ee4d37/kube-rbac-proxy/0.log" Nov 25 10:14:31 crc kubenswrapper[4565]: I1125 10:14:31.759338 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-gzvgq_2d23227d-3456-4a34-9aa7-4878c7ee4d37/controller/0.log" Nov 25 10:14:31 crc kubenswrapper[4565]: I1125 10:14:31.808269 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/cp-frr-files/0.log" Nov 25 10:14:32 crc kubenswrapper[4565]: I1125 10:14:32.011447 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/cp-frr-files/0.log" Nov 25 10:14:32 crc kubenswrapper[4565]: I1125 10:14:32.073981 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/cp-reloader/0.log" Nov 25 10:14:32 crc kubenswrapper[4565]: I1125 10:14:32.079912 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/cp-metrics/0.log" Nov 25 10:14:32 crc kubenswrapper[4565]: I1125 10:14:32.114195 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/cp-reloader/0.log" Nov 25 10:14:32 crc kubenswrapper[4565]: I1125 10:14:32.324208 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/cp-frr-files/0.log" Nov 25 10:14:32 crc kubenswrapper[4565]: I1125 10:14:32.342665 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/cp-reloader/0.log" Nov 25 10:14:32 crc kubenswrapper[4565]: I1125 10:14:32.370533 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/cp-metrics/0.log" Nov 25 10:14:32 crc kubenswrapper[4565]: I1125 10:14:32.383220 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/cp-metrics/0.log" Nov 25 10:14:32 crc kubenswrapper[4565]: I1125 10:14:32.580549 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/cp-frr-files/0.log" Nov 25 10:14:32 crc kubenswrapper[4565]: I1125 10:14:32.590337 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/controller/0.log" Nov 25 10:14:32 crc kubenswrapper[4565]: I1125 10:14:32.598088 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/cp-metrics/0.log" Nov 25 10:14:32 crc kubenswrapper[4565]: I1125 10:14:32.616999 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/cp-reloader/0.log" Nov 25 10:14:32 crc kubenswrapper[4565]: I1125 10:14:32.755802 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/kube-rbac-proxy/0.log" Nov 25 10:14:32 crc kubenswrapper[4565]: I1125 10:14:32.818439 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/frr-metrics/0.log" Nov 25 10:14:32 crc kubenswrapper[4565]: I1125 10:14:32.910602 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/kube-rbac-proxy-frr/0.log" Nov 25 10:14:32 crc kubenswrapper[4565]: I1125 10:14:32.983257 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/reloader/0.log" Nov 25 10:14:33 crc kubenswrapper[4565]: I1125 10:14:33.170890 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-6998585d5-b722z_23dd1cf1-30ed-4fc1-9b32-70897895e05d/frr-k8s-webhook-server/0.log" Nov 25 10:14:33 crc kubenswrapper[4565]: I1125 10:14:33.389508 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-74454849f9-fjwfp_145e5d59-fd78-4bc1-a97c-17ebf0d67fa4/manager/3.log" Nov 25 10:14:33 crc kubenswrapper[4565]: I1125 10:14:33.458813 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-74454849f9-fjwfp_145e5d59-fd78-4bc1-a97c-17ebf0d67fa4/manager/2.log" Nov 25 10:14:33 crc kubenswrapper[4565]: I1125 10:14:33.733679 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7fb8cb44b7-5dvrd_14fec6d4-6935-4283-944d-6a229b3cdc82/webhook-server/0.log" Nov 25 10:14:33 crc kubenswrapper[4565]: I1125 10:14:33.870155 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dr2xf_1d95b8b8-1675-48ae-b497-ff3fcf0bbc42/kube-rbac-proxy/0.log" Nov 25 10:14:34 crc kubenswrapper[4565]: I1125 10:14:34.028253 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82nts_2a3347c5-5075-4d8d-99fb-cd2468efe83d/frr/0.log" Nov 25 10:14:34 crc kubenswrapper[4565]: I1125 
10:14:34.408248 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dr2xf_1d95b8b8-1675-48ae-b497-ff3fcf0bbc42/speaker/0.log" Nov 25 10:14:47 crc kubenswrapper[4565]: I1125 10:14:47.388348 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9_950c1190-c404-483f-bb4a-5a3fe7548ccf/util/0.log" Nov 25 10:14:47 crc kubenswrapper[4565]: I1125 10:14:47.580422 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9_950c1190-c404-483f-bb4a-5a3fe7548ccf/pull/0.log" Nov 25 10:14:47 crc kubenswrapper[4565]: I1125 10:14:47.634709 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9_950c1190-c404-483f-bb4a-5a3fe7548ccf/util/0.log" Nov 25 10:14:47 crc kubenswrapper[4565]: I1125 10:14:47.639466 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9_950c1190-c404-483f-bb4a-5a3fe7548ccf/pull/0.log" Nov 25 10:14:47 crc kubenswrapper[4565]: I1125 10:14:47.780767 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9_950c1190-c404-483f-bb4a-5a3fe7548ccf/util/0.log" Nov 25 10:14:47 crc kubenswrapper[4565]: I1125 10:14:47.818117 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9_950c1190-c404-483f-bb4a-5a3fe7548ccf/pull/0.log" Nov 25 10:14:47 crc kubenswrapper[4565]: I1125 10:14:47.847674 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772epqkc9_950c1190-c404-483f-bb4a-5a3fe7548ccf/extract/0.log" 
Nov 25 10:14:47 crc kubenswrapper[4565]: I1125 10:14:47.980655 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wrgj8_45e414d4-5edb-48e0-a419-2d00347fda7c/extract-utilities/0.log" Nov 25 10:14:48 crc kubenswrapper[4565]: I1125 10:14:48.201222 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wrgj8_45e414d4-5edb-48e0-a419-2d00347fda7c/extract-content/0.log" Nov 25 10:14:48 crc kubenswrapper[4565]: I1125 10:14:48.205013 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wrgj8_45e414d4-5edb-48e0-a419-2d00347fda7c/extract-utilities/0.log" Nov 25 10:14:48 crc kubenswrapper[4565]: I1125 10:14:48.216778 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wrgj8_45e414d4-5edb-48e0-a419-2d00347fda7c/extract-content/0.log" Nov 25 10:14:48 crc kubenswrapper[4565]: I1125 10:14:48.463922 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wrgj8_45e414d4-5edb-48e0-a419-2d00347fda7c/extract-content/0.log" Nov 25 10:14:48 crc kubenswrapper[4565]: I1125 10:14:48.519702 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wrgj8_45e414d4-5edb-48e0-a419-2d00347fda7c/extract-utilities/0.log" Nov 25 10:14:48 crc kubenswrapper[4565]: I1125 10:14:48.771790 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dq959_37044fab-4a08-4d4f-a2d4-9da1a0eb5127/extract-utilities/0.log" Nov 25 10:14:48 crc kubenswrapper[4565]: I1125 10:14:48.934043 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wrgj8_45e414d4-5edb-48e0-a419-2d00347fda7c/registry-server/0.log" Nov 25 10:14:48 crc kubenswrapper[4565]: I1125 10:14:48.974158 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-dq959_37044fab-4a08-4d4f-a2d4-9da1a0eb5127/extract-content/0.log" Nov 25 10:14:48 crc kubenswrapper[4565]: I1125 10:14:48.997879 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dq959_37044fab-4a08-4d4f-a2d4-9da1a0eb5127/extract-utilities/0.log" Nov 25 10:14:49 crc kubenswrapper[4565]: I1125 10:14:49.011581 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dq959_37044fab-4a08-4d4f-a2d4-9da1a0eb5127/extract-content/0.log" Nov 25 10:14:49 crc kubenswrapper[4565]: I1125 10:14:49.180414 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dq959_37044fab-4a08-4d4f-a2d4-9da1a0eb5127/extract-content/0.log" Nov 25 10:14:49 crc kubenswrapper[4565]: I1125 10:14:49.182838 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dq959_37044fab-4a08-4d4f-a2d4-9da1a0eb5127/extract-utilities/0.log" Nov 25 10:14:49 crc kubenswrapper[4565]: I1125 10:14:49.412462 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m_f291e538-ed14-4652-bf08-0b52ac487353/util/0.log" Nov 25 10:14:49 crc kubenswrapper[4565]: I1125 10:14:49.782406 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dq959_37044fab-4a08-4d4f-a2d4-9da1a0eb5127/registry-server/0.log" Nov 25 10:14:49 crc kubenswrapper[4565]: I1125 10:14:49.852011 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m_f291e538-ed14-4652-bf08-0b52ac487353/pull/0.log" Nov 25 10:14:49 crc kubenswrapper[4565]: I1125 10:14:49.871949 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m_f291e538-ed14-4652-bf08-0b52ac487353/pull/0.log" Nov 25 10:14:49 crc kubenswrapper[4565]: I1125 10:14:49.907041 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m_f291e538-ed14-4652-bf08-0b52ac487353/util/0.log" Nov 25 10:14:50 crc kubenswrapper[4565]: I1125 10:14:50.143537 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m_f291e538-ed14-4652-bf08-0b52ac487353/util/0.log" Nov 25 10:14:50 crc kubenswrapper[4565]: I1125 10:14:50.200565 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m_f291e538-ed14-4652-bf08-0b52ac487353/extract/0.log" Nov 25 10:14:50 crc kubenswrapper[4565]: I1125 10:14:50.256241 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6hz92m_f291e538-ed14-4652-bf08-0b52ac487353/pull/0.log" Nov 25 10:14:50 crc kubenswrapper[4565]: I1125 10:14:50.374780 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-rgcml_5957e9ea-c2fe-43cb-9318-e22ae96c689c/marketplace-operator/0.log" Nov 25 10:14:50 crc kubenswrapper[4565]: I1125 10:14:50.479214 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mnphj_849cf673-c6d9-4372-a23d-96e04c71a796/extract-utilities/0.log" Nov 25 10:14:50 crc kubenswrapper[4565]: I1125 10:14:50.646862 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mnphj_849cf673-c6d9-4372-a23d-96e04c71a796/extract-content/0.log" Nov 25 10:14:50 crc kubenswrapper[4565]: I1125 10:14:50.659735 4565 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mnphj_849cf673-c6d9-4372-a23d-96e04c71a796/extract-content/0.log" Nov 25 10:14:50 crc kubenswrapper[4565]: I1125 10:14:50.696278 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mnphj_849cf673-c6d9-4372-a23d-96e04c71a796/extract-utilities/0.log" Nov 25 10:14:50 crc kubenswrapper[4565]: I1125 10:14:50.886408 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mnphj_849cf673-c6d9-4372-a23d-96e04c71a796/extract-utilities/0.log" Nov 25 10:14:50 crc kubenswrapper[4565]: I1125 10:14:50.907751 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mnphj_849cf673-c6d9-4372-a23d-96e04c71a796/extract-content/0.log" Nov 25 10:14:51 crc kubenswrapper[4565]: I1125 10:14:51.094306 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mnphj_849cf673-c6d9-4372-a23d-96e04c71a796/registry-server/0.log" Nov 25 10:14:51 crc kubenswrapper[4565]: I1125 10:14:51.179072 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pwhqj_23308127-bd19-4001-80f1-9cc93c692984/extract-utilities/0.log" Nov 25 10:14:51 crc kubenswrapper[4565]: I1125 10:14:51.352164 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pwhqj_23308127-bd19-4001-80f1-9cc93c692984/extract-utilities/0.log" Nov 25 10:14:51 crc kubenswrapper[4565]: I1125 10:14:51.356661 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pwhqj_23308127-bd19-4001-80f1-9cc93c692984/extract-content/0.log" Nov 25 10:14:51 crc kubenswrapper[4565]: I1125 10:14:51.374734 4565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-pwhqj_23308127-bd19-4001-80f1-9cc93c692984/extract-content/0.log" Nov 25 10:14:51 crc kubenswrapper[4565]: I1125 10:14:51.528455 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pwhqj_23308127-bd19-4001-80f1-9cc93c692984/extract-utilities/0.log" Nov 25 10:14:51 crc kubenswrapper[4565]: I1125 10:14:51.605532 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pwhqj_23308127-bd19-4001-80f1-9cc93c692984/extract-content/0.log" Nov 25 10:14:52 crc kubenswrapper[4565]: I1125 10:14:52.090363 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pwhqj_23308127-bd19-4001-80f1-9cc93c692984/registry-server/0.log" Nov 25 10:14:55 crc kubenswrapper[4565]: I1125 10:14:55.099104 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 10:14:55 crc kubenswrapper[4565]: I1125 10:14:55.099460 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 10:15:00 crc kubenswrapper[4565]: I1125 10:15:00.161917 4565 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401095-2gwh2"] Nov 25 10:15:00 crc kubenswrapper[4565]: E1125 10:15:00.162813 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a54946e-9de2-408e-93ce-1dd9bf48f5ad" containerName="extract-utilities" Nov 25 10:15:00 crc 
kubenswrapper[4565]: I1125 10:15:00.162828 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a54946e-9de2-408e-93ce-1dd9bf48f5ad" containerName="extract-utilities" Nov 25 10:15:00 crc kubenswrapper[4565]: E1125 10:15:00.162854 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a54946e-9de2-408e-93ce-1dd9bf48f5ad" containerName="registry-server" Nov 25 10:15:00 crc kubenswrapper[4565]: I1125 10:15:00.162859 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a54946e-9de2-408e-93ce-1dd9bf48f5ad" containerName="registry-server" Nov 25 10:15:00 crc kubenswrapper[4565]: E1125 10:15:00.162873 4565 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a54946e-9de2-408e-93ce-1dd9bf48f5ad" containerName="extract-content" Nov 25 10:15:00 crc kubenswrapper[4565]: I1125 10:15:00.162881 4565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a54946e-9de2-408e-93ce-1dd9bf48f5ad" containerName="extract-content" Nov 25 10:15:00 crc kubenswrapper[4565]: I1125 10:15:00.163076 4565 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a54946e-9de2-408e-93ce-1dd9bf48f5ad" containerName="registry-server" Nov 25 10:15:00 crc kubenswrapper[4565]: I1125 10:15:00.163699 4565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401095-2gwh2" Nov 25 10:15:00 crc kubenswrapper[4565]: I1125 10:15:00.165761 4565 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 25 10:15:00 crc kubenswrapper[4565]: I1125 10:15:00.166219 4565 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 25 10:15:00 crc kubenswrapper[4565]: I1125 10:15:00.182539 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401095-2gwh2"] Nov 25 10:15:00 crc kubenswrapper[4565]: I1125 10:15:00.183449 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6256f84-d85e-417d-95f9-d6df7b9f5783-secret-volume\") pod \"collect-profiles-29401095-2gwh2\" (UID: \"e6256f84-d85e-417d-95f9-d6df7b9f5783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401095-2gwh2" Nov 25 10:15:00 crc kubenswrapper[4565]: I1125 10:15:00.183600 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6b2g\" (UniqueName: \"kubernetes.io/projected/e6256f84-d85e-417d-95f9-d6df7b9f5783-kube-api-access-s6b2g\") pod \"collect-profiles-29401095-2gwh2\" (UID: \"e6256f84-d85e-417d-95f9-d6df7b9f5783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401095-2gwh2" Nov 25 10:15:00 crc kubenswrapper[4565]: I1125 10:15:00.183779 4565 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6256f84-d85e-417d-95f9-d6df7b9f5783-config-volume\") pod \"collect-profiles-29401095-2gwh2\" (UID: \"e6256f84-d85e-417d-95f9-d6df7b9f5783\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29401095-2gwh2" Nov 25 10:15:00 crc kubenswrapper[4565]: I1125 10:15:00.286573 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6256f84-d85e-417d-95f9-d6df7b9f5783-secret-volume\") pod \"collect-profiles-29401095-2gwh2\" (UID: \"e6256f84-d85e-417d-95f9-d6df7b9f5783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401095-2gwh2" Nov 25 10:15:00 crc kubenswrapper[4565]: I1125 10:15:00.286676 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6b2g\" (UniqueName: \"kubernetes.io/projected/e6256f84-d85e-417d-95f9-d6df7b9f5783-kube-api-access-s6b2g\") pod \"collect-profiles-29401095-2gwh2\" (UID: \"e6256f84-d85e-417d-95f9-d6df7b9f5783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401095-2gwh2" Nov 25 10:15:00 crc kubenswrapper[4565]: I1125 10:15:00.286770 4565 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6256f84-d85e-417d-95f9-d6df7b9f5783-config-volume\") pod \"collect-profiles-29401095-2gwh2\" (UID: \"e6256f84-d85e-417d-95f9-d6df7b9f5783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401095-2gwh2" Nov 25 10:15:00 crc kubenswrapper[4565]: I1125 10:15:00.287845 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6256f84-d85e-417d-95f9-d6df7b9f5783-config-volume\") pod \"collect-profiles-29401095-2gwh2\" (UID: \"e6256f84-d85e-417d-95f9-d6df7b9f5783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401095-2gwh2" Nov 25 10:15:00 crc kubenswrapper[4565]: I1125 10:15:00.292219 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e6256f84-d85e-417d-95f9-d6df7b9f5783-secret-volume\") pod \"collect-profiles-29401095-2gwh2\" (UID: \"e6256f84-d85e-417d-95f9-d6df7b9f5783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401095-2gwh2" Nov 25 10:15:00 crc kubenswrapper[4565]: I1125 10:15:00.305410 4565 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6b2g\" (UniqueName: \"kubernetes.io/projected/e6256f84-d85e-417d-95f9-d6df7b9f5783-kube-api-access-s6b2g\") pod \"collect-profiles-29401095-2gwh2\" (UID: \"e6256f84-d85e-417d-95f9-d6df7b9f5783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401095-2gwh2" Nov 25 10:15:00 crc kubenswrapper[4565]: I1125 10:15:00.481863 4565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401095-2gwh2" Nov 25 10:15:00 crc kubenswrapper[4565]: I1125 10:15:00.947800 4565 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401095-2gwh2"] Nov 25 10:15:01 crc kubenswrapper[4565]: I1125 10:15:01.908939 4565 generic.go:334] "Generic (PLEG): container finished" podID="e6256f84-d85e-417d-95f9-d6df7b9f5783" containerID="d10be77d67930ef23ef828822d04510a2ac745afb6ee1d9c63a4d36d212d692b" exitCode=0 Nov 25 10:15:01 crc kubenswrapper[4565]: I1125 10:15:01.909053 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401095-2gwh2" event={"ID":"e6256f84-d85e-417d-95f9-d6df7b9f5783","Type":"ContainerDied","Data":"d10be77d67930ef23ef828822d04510a2ac745afb6ee1d9c63a4d36d212d692b"} Nov 25 10:15:01 crc kubenswrapper[4565]: I1125 10:15:01.909258 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401095-2gwh2" 
event={"ID":"e6256f84-d85e-417d-95f9-d6df7b9f5783","Type":"ContainerStarted","Data":"e890e6d48100b243990b4dabb171af5aab293252cf5fdbd34fc7fc489bc741c1"} Nov 25 10:15:03 crc kubenswrapper[4565]: I1125 10:15:03.408819 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401095-2gwh2" Nov 25 10:15:03 crc kubenswrapper[4565]: I1125 10:15:03.562338 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6256f84-d85e-417d-95f9-d6df7b9f5783-config-volume\") pod \"e6256f84-d85e-417d-95f9-d6df7b9f5783\" (UID: \"e6256f84-d85e-417d-95f9-d6df7b9f5783\") " Nov 25 10:15:03 crc kubenswrapper[4565]: I1125 10:15:03.562552 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6256f84-d85e-417d-95f9-d6df7b9f5783-secret-volume\") pod \"e6256f84-d85e-417d-95f9-d6df7b9f5783\" (UID: \"e6256f84-d85e-417d-95f9-d6df7b9f5783\") " Nov 25 10:15:03 crc kubenswrapper[4565]: I1125 10:15:03.562749 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6b2g\" (UniqueName: \"kubernetes.io/projected/e6256f84-d85e-417d-95f9-d6df7b9f5783-kube-api-access-s6b2g\") pod \"e6256f84-d85e-417d-95f9-d6df7b9f5783\" (UID: \"e6256f84-d85e-417d-95f9-d6df7b9f5783\") " Nov 25 10:15:03 crc kubenswrapper[4565]: I1125 10:15:03.563358 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6256f84-d85e-417d-95f9-d6df7b9f5783-config-volume" (OuterVolumeSpecName: "config-volume") pod "e6256f84-d85e-417d-95f9-d6df7b9f5783" (UID: "e6256f84-d85e-417d-95f9-d6df7b9f5783"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 10:15:03 crc kubenswrapper[4565]: I1125 10:15:03.563549 4565 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6256f84-d85e-417d-95f9-d6df7b9f5783-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 10:15:03 crc kubenswrapper[4565]: I1125 10:15:03.570219 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6256f84-d85e-417d-95f9-d6df7b9f5783-kube-api-access-s6b2g" (OuterVolumeSpecName: "kube-api-access-s6b2g") pod "e6256f84-d85e-417d-95f9-d6df7b9f5783" (UID: "e6256f84-d85e-417d-95f9-d6df7b9f5783"). InnerVolumeSpecName "kube-api-access-s6b2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 10:15:03 crc kubenswrapper[4565]: I1125 10:15:03.570796 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6256f84-d85e-417d-95f9-d6df7b9f5783-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e6256f84-d85e-417d-95f9-d6df7b9f5783" (UID: "e6256f84-d85e-417d-95f9-d6df7b9f5783"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 10:15:03 crc kubenswrapper[4565]: I1125 10:15:03.666860 4565 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6256f84-d85e-417d-95f9-d6df7b9f5783-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 25 10:15:03 crc kubenswrapper[4565]: I1125 10:15:03.666912 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6b2g\" (UniqueName: \"kubernetes.io/projected/e6256f84-d85e-417d-95f9-d6df7b9f5783-kube-api-access-s6b2g\") on node \"crc\" DevicePath \"\"" Nov 25 10:15:03 crc kubenswrapper[4565]: I1125 10:15:03.946280 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401095-2gwh2" event={"ID":"e6256f84-d85e-417d-95f9-d6df7b9f5783","Type":"ContainerDied","Data":"e890e6d48100b243990b4dabb171af5aab293252cf5fdbd34fc7fc489bc741c1"} Nov 25 10:15:03 crc kubenswrapper[4565]: I1125 10:15:03.946367 4565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e890e6d48100b243990b4dabb171af5aab293252cf5fdbd34fc7fc489bc741c1" Nov 25 10:15:03 crc kubenswrapper[4565]: I1125 10:15:03.946493 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401095-2gwh2" Nov 25 10:15:04 crc kubenswrapper[4565]: I1125 10:15:04.495722 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401050-kld9x"] Nov 25 10:15:04 crc kubenswrapper[4565]: I1125 10:15:04.507779 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401050-kld9x"] Nov 25 10:15:05 crc kubenswrapper[4565]: I1125 10:15:05.109482 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01f90bd7-1f21-496a-b040-3cbaed72bea6" path="/var/lib/kubelet/pods/01f90bd7-1f21-496a-b040-3cbaed72bea6/volumes" Nov 25 10:15:25 crc kubenswrapper[4565]: I1125 10:15:25.099958 4565 patch_prober.go:28] interesting pod/machine-config-daemon-r28bt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 10:15:25 crc kubenswrapper[4565]: I1125 10:15:25.100558 4565 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 10:15:25 crc kubenswrapper[4565]: I1125 10:15:25.108565 4565 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" Nov 25 10:15:25 crc kubenswrapper[4565]: I1125 10:15:25.109080 4565 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02"} 
pod="openshift-machine-config-operator/machine-config-daemon-r28bt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 10:15:25 crc kubenswrapper[4565]: I1125 10:15:25.109138 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerName="machine-config-daemon" containerID="cri-o://ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" gracePeriod=600 Nov 25 10:15:25 crc kubenswrapper[4565]: E1125 10:15:25.235970 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:15:26 crc kubenswrapper[4565]: I1125 10:15:26.166090 4565 generic.go:334] "Generic (PLEG): container finished" podID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" exitCode=0 Nov 25 10:15:26 crc kubenswrapper[4565]: I1125 10:15:26.166548 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerDied","Data":"ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02"} Nov 25 10:15:26 crc kubenswrapper[4565]: I1125 10:15:26.166613 4565 scope.go:117] "RemoveContainer" containerID="2c700bc6358e1ec78e7ef216340493892cce22209e9acdeedd22c23d90b0fb59" Nov 25 10:15:26 crc kubenswrapper[4565]: I1125 10:15:26.167626 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 
25 10:15:26 crc kubenswrapper[4565]: E1125 10:15:26.168105 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:15:37 crc kubenswrapper[4565]: I1125 10:15:37.113293 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 25 10:15:37 crc kubenswrapper[4565]: E1125 10:15:37.115653 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:15:38 crc kubenswrapper[4565]: I1125 10:15:38.281916 4565 scope.go:117] "RemoveContainer" containerID="c7f59b994cb8ed4c077d58bf2969e4b9ac349a22768851246d6ae90272168a2e" Nov 25 10:15:48 crc kubenswrapper[4565]: I1125 10:15:48.096886 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 25 10:15:48 crc kubenswrapper[4565]: E1125 10:15:48.097630 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" 
podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:16:00 crc kubenswrapper[4565]: I1125 10:16:00.096992 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 25 10:16:00 crc kubenswrapper[4565]: E1125 10:16:00.097861 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:16:12 crc kubenswrapper[4565]: I1125 10:16:12.097287 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 25 10:16:12 crc kubenswrapper[4565]: E1125 10:16:12.099231 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:16:26 crc kubenswrapper[4565]: I1125 10:16:26.097996 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 25 10:16:26 crc kubenswrapper[4565]: E1125 10:16:26.099846 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:16:41 crc kubenswrapper[4565]: I1125 10:16:41.096793 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 25 10:16:41 crc kubenswrapper[4565]: E1125 10:16:41.097600 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:16:55 crc kubenswrapper[4565]: I1125 10:16:55.097726 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 25 10:16:55 crc kubenswrapper[4565]: E1125 10:16:55.098906 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:17:03 crc kubenswrapper[4565]: I1125 10:17:03.204811 4565 generic.go:334] "Generic (PLEG): container finished" podID="5131b53e-f713-4c17-b497-89ec226e9df6" containerID="6d85d3b3f57b6ed4964b0cd393cb14358df3a7b1221293bf7ee34b446e738b7a" exitCode=0 Nov 25 10:17:03 crc kubenswrapper[4565]: I1125 10:17:03.205156 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5tm2p/must-gather-ldl4x" 
event={"ID":"5131b53e-f713-4c17-b497-89ec226e9df6","Type":"ContainerDied","Data":"6d85d3b3f57b6ed4964b0cd393cb14358df3a7b1221293bf7ee34b446e738b7a"} Nov 25 10:17:03 crc kubenswrapper[4565]: I1125 10:17:03.206093 4565 scope.go:117] "RemoveContainer" containerID="6d85d3b3f57b6ed4964b0cd393cb14358df3a7b1221293bf7ee34b446e738b7a" Nov 25 10:17:03 crc kubenswrapper[4565]: I1125 10:17:03.406520 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5tm2p_must-gather-ldl4x_5131b53e-f713-4c17-b497-89ec226e9df6/gather/0.log" Nov 25 10:17:06 crc kubenswrapper[4565]: I1125 10:17:06.097887 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 25 10:17:06 crc kubenswrapper[4565]: E1125 10:17:06.099644 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:17:16 crc kubenswrapper[4565]: I1125 10:17:16.361129 4565 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5tm2p/must-gather-ldl4x"] Nov 25 10:17:16 crc kubenswrapper[4565]: I1125 10:17:16.363199 4565 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-5tm2p/must-gather-ldl4x" podUID="5131b53e-f713-4c17-b497-89ec226e9df6" containerName="copy" containerID="cri-o://6ade9fdd8804f58222770c0cd2b0bcf257f9adb5e1135b38e9165b0e81328b3e" gracePeriod=2 Nov 25 10:17:16 crc kubenswrapper[4565]: I1125 10:17:16.398732 4565 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5tm2p/must-gather-ldl4x"] Nov 25 10:17:17 crc kubenswrapper[4565]: I1125 10:17:17.001921 4565 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5tm2p_must-gather-ldl4x_5131b53e-f713-4c17-b497-89ec226e9df6/copy/0.log" Nov 25 10:17:17 crc kubenswrapper[4565]: I1125 10:17:17.002775 4565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5tm2p/must-gather-ldl4x" Nov 25 10:17:17 crc kubenswrapper[4565]: I1125 10:17:17.044286 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqs28\" (UniqueName: \"kubernetes.io/projected/5131b53e-f713-4c17-b497-89ec226e9df6-kube-api-access-bqs28\") pod \"5131b53e-f713-4c17-b497-89ec226e9df6\" (UID: \"5131b53e-f713-4c17-b497-89ec226e9df6\") " Nov 25 10:17:17 crc kubenswrapper[4565]: I1125 10:17:17.044395 4565 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5131b53e-f713-4c17-b497-89ec226e9df6-must-gather-output\") pod \"5131b53e-f713-4c17-b497-89ec226e9df6\" (UID: \"5131b53e-f713-4c17-b497-89ec226e9df6\") " Nov 25 10:17:17 crc kubenswrapper[4565]: I1125 10:17:17.052200 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5131b53e-f713-4c17-b497-89ec226e9df6-kube-api-access-bqs28" (OuterVolumeSpecName: "kube-api-access-bqs28") pod "5131b53e-f713-4c17-b497-89ec226e9df6" (UID: "5131b53e-f713-4c17-b497-89ec226e9df6"). InnerVolumeSpecName "kube-api-access-bqs28". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 10:17:17 crc kubenswrapper[4565]: I1125 10:17:17.106291 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 25 10:17:17 crc kubenswrapper[4565]: E1125 10:17:17.106830 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:17:17 crc kubenswrapper[4565]: I1125 10:17:17.154372 4565 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqs28\" (UniqueName: \"kubernetes.io/projected/5131b53e-f713-4c17-b497-89ec226e9df6-kube-api-access-bqs28\") on node \"crc\" DevicePath \"\"" Nov 25 10:17:17 crc kubenswrapper[4565]: I1125 10:17:17.207840 4565 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5131b53e-f713-4c17-b497-89ec226e9df6-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "5131b53e-f713-4c17-b497-89ec226e9df6" (UID: "5131b53e-f713-4c17-b497-89ec226e9df6"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 10:17:17 crc kubenswrapper[4565]: I1125 10:17:17.257673 4565 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5131b53e-f713-4c17-b497-89ec226e9df6-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 25 10:17:17 crc kubenswrapper[4565]: I1125 10:17:17.388744 4565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5tm2p_must-gather-ldl4x_5131b53e-f713-4c17-b497-89ec226e9df6/copy/0.log" Nov 25 10:17:17 crc kubenswrapper[4565]: I1125 10:17:17.389067 4565 generic.go:334] "Generic (PLEG): container finished" podID="5131b53e-f713-4c17-b497-89ec226e9df6" containerID="6ade9fdd8804f58222770c0cd2b0bcf257f9adb5e1135b38e9165b0e81328b3e" exitCode=143 Nov 25 10:17:17 crc kubenswrapper[4565]: I1125 10:17:17.389109 4565 scope.go:117] "RemoveContainer" containerID="6ade9fdd8804f58222770c0cd2b0bcf257f9adb5e1135b38e9165b0e81328b3e" Nov 25 10:17:17 crc kubenswrapper[4565]: I1125 10:17:17.389233 4565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5tm2p/must-gather-ldl4x" Nov 25 10:17:17 crc kubenswrapper[4565]: I1125 10:17:17.433557 4565 scope.go:117] "RemoveContainer" containerID="6d85d3b3f57b6ed4964b0cd393cb14358df3a7b1221293bf7ee34b446e738b7a" Nov 25 10:17:17 crc kubenswrapper[4565]: I1125 10:17:17.474535 4565 scope.go:117] "RemoveContainer" containerID="6ade9fdd8804f58222770c0cd2b0bcf257f9adb5e1135b38e9165b0e81328b3e" Nov 25 10:17:17 crc kubenswrapper[4565]: E1125 10:17:17.475352 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ade9fdd8804f58222770c0cd2b0bcf257f9adb5e1135b38e9165b0e81328b3e\": container with ID starting with 6ade9fdd8804f58222770c0cd2b0bcf257f9adb5e1135b38e9165b0e81328b3e not found: ID does not exist" containerID="6ade9fdd8804f58222770c0cd2b0bcf257f9adb5e1135b38e9165b0e81328b3e" Nov 25 10:17:17 crc kubenswrapper[4565]: I1125 10:17:17.475396 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ade9fdd8804f58222770c0cd2b0bcf257f9adb5e1135b38e9165b0e81328b3e"} err="failed to get container status \"6ade9fdd8804f58222770c0cd2b0bcf257f9adb5e1135b38e9165b0e81328b3e\": rpc error: code = NotFound desc = could not find container \"6ade9fdd8804f58222770c0cd2b0bcf257f9adb5e1135b38e9165b0e81328b3e\": container with ID starting with 6ade9fdd8804f58222770c0cd2b0bcf257f9adb5e1135b38e9165b0e81328b3e not found: ID does not exist" Nov 25 10:17:17 crc kubenswrapper[4565]: I1125 10:17:17.475424 4565 scope.go:117] "RemoveContainer" containerID="6d85d3b3f57b6ed4964b0cd393cb14358df3a7b1221293bf7ee34b446e738b7a" Nov 25 10:17:17 crc kubenswrapper[4565]: E1125 10:17:17.478295 4565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d85d3b3f57b6ed4964b0cd393cb14358df3a7b1221293bf7ee34b446e738b7a\": container with ID starting with 
6d85d3b3f57b6ed4964b0cd393cb14358df3a7b1221293bf7ee34b446e738b7a not found: ID does not exist" containerID="6d85d3b3f57b6ed4964b0cd393cb14358df3a7b1221293bf7ee34b446e738b7a" Nov 25 10:17:17 crc kubenswrapper[4565]: I1125 10:17:17.478325 4565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d85d3b3f57b6ed4964b0cd393cb14358df3a7b1221293bf7ee34b446e738b7a"} err="failed to get container status \"6d85d3b3f57b6ed4964b0cd393cb14358df3a7b1221293bf7ee34b446e738b7a\": rpc error: code = NotFound desc = could not find container \"6d85d3b3f57b6ed4964b0cd393cb14358df3a7b1221293bf7ee34b446e738b7a\": container with ID starting with 6d85d3b3f57b6ed4964b0cd393cb14358df3a7b1221293bf7ee34b446e738b7a not found: ID does not exist" Nov 25 10:17:19 crc kubenswrapper[4565]: I1125 10:17:19.109402 4565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5131b53e-f713-4c17-b497-89ec226e9df6" path="/var/lib/kubelet/pods/5131b53e-f713-4c17-b497-89ec226e9df6/volumes" Nov 25 10:17:28 crc kubenswrapper[4565]: I1125 10:17:28.097124 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 25 10:17:28 crc kubenswrapper[4565]: E1125 10:17:28.098288 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:17:43 crc kubenswrapper[4565]: I1125 10:17:43.098216 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 25 10:17:43 crc kubenswrapper[4565]: E1125 10:17:43.099368 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:17:56 crc kubenswrapper[4565]: I1125 10:17:56.098619 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 25 10:17:56 crc kubenswrapper[4565]: E1125 10:17:56.099505 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:18:10 crc kubenswrapper[4565]: I1125 10:18:10.097560 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 25 10:18:10 crc kubenswrapper[4565]: E1125 10:18:10.098533 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:18:22 crc kubenswrapper[4565]: I1125 10:18:22.097703 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 25 10:18:22 crc kubenswrapper[4565]: E1125 10:18:22.098687 4565 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:18:34 crc kubenswrapper[4565]: I1125 10:18:34.097867 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 25 10:18:34 crc kubenswrapper[4565]: E1125 10:18:34.098739 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:18:49 crc kubenswrapper[4565]: I1125 10:18:49.096769 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 25 10:18:49 crc kubenswrapper[4565]: E1125 10:18:49.098254 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:19:00 crc kubenswrapper[4565]: I1125 10:19:00.097360 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 25 10:19:00 crc kubenswrapper[4565]: E1125 10:19:00.098186 4565 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:19:15 crc kubenswrapper[4565]: I1125 10:19:15.097552 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 25 10:19:15 crc kubenswrapper[4565]: E1125 10:19:15.098541 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:19:27 crc kubenswrapper[4565]: I1125 10:19:27.107242 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 25 10:19:27 crc kubenswrapper[4565]: E1125 10:19:27.108079 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:19:41 crc kubenswrapper[4565]: I1125 10:19:41.100043 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 25 10:19:41 crc kubenswrapper[4565]: E1125 10:19:41.100802 4565 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:19:56 crc kubenswrapper[4565]: I1125 10:19:56.097613 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 25 10:19:56 crc kubenswrapper[4565]: E1125 10:19:56.098462 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:20:10 crc kubenswrapper[4565]: I1125 10:20:10.098050 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 25 10:20:10 crc kubenswrapper[4565]: E1125 10:20:10.099155 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:20:21 crc kubenswrapper[4565]: I1125 10:20:21.097888 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 25 10:20:21 crc kubenswrapper[4565]: E1125 
10:20:21.098673 4565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-r28bt_openshift-machine-config-operator(80bad26f-53b0-48f7-9ac4-110d3d8a475d)\"" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" podUID="80bad26f-53b0-48f7-9ac4-110d3d8a475d" Nov 25 10:20:33 crc kubenswrapper[4565]: I1125 10:20:33.098645 4565 scope.go:117] "RemoveContainer" containerID="ab933f873b45da0d1a4a36833790d65354066fe2470f5f27f2e8d3af0f6ada02" Nov 25 10:20:33 crc kubenswrapper[4565]: I1125 10:20:33.349973 4565 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-r28bt" event={"ID":"80bad26f-53b0-48f7-9ac4-110d3d8a475d","Type":"ContainerStarted","Data":"54fd925bf56a9f8c5ef4e4dcecc17a260d17f6f2c7186012978f8b0d584d46bd"}